Introduction

In 2018 The Open University Library undertook a project to review some of its higher-cost resources, partly to show justification for the Library’s existing budget but also to look at how the content aligned with the University’s 2018 curriculum plan. A project team consisting of the Senior Library Manager (Content & Licensing), an E-content Advisor (Academic Librarian) and a Senior Library Assistant was given eight months to carry out this work, along with other tasks including a staffing requirements analysis and literature reviews of Library budget management and research use of Library content. The team agreed to start the project by developing a methodology for our resource reviews and using this to evaluate the resources identified as higher-cost content.

As part of this work, we discussed how we should go about conducting the reviews and decided to begin by asking ourselves the question, ‘Why are reviews like this needed?’

One of our starting points was to look at the international project TERMS (Techniques for Electronic Resources Management), which began in 2008 and grew out of a discussion between the TERMS authors over a lack of consistency in e-resource management practices. TERMS aimed to set out an e-resource life cycle and to define a set of best practices using real-world examples gathered from libraries in the UK, US and worldwide. For many years TERMS was hosted as a Tumblr blog, so regular updates could be added. The documents in TERMS discussed how one key part of the life cycle is ongoing evaluation to ensure the resource remains of value to the institution in satisfying its research and teaching aims. TERMS suggests that resources should first be reviewed between three and five years after purchase and then annually thereafter:

‘The best evaluation of a product or service happens within a three- to five-year time frame. The arc of usage and user behavior is not fully realized until the third year of activity for any given resource or service. Evaluation of user behavior and usage data is important in building up a detailed picture of the appropriateness of the resource over time and is invaluable when it comes time to review the resource in the future’.

The methodology we have developed takes the format of a series of questions about the resource to be evaluated and suggests the data to use for each stage of the evaluation. It was developed from the Library's existing lighter-touch annual resource review, which consists of recording use (usually COUNTER data) for each resource, using metrics to show cost per use, and providing minimal background information on the resource, such as when it was purchased and what it is used for.
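
As an illustration of the core metric in that annual review, cost per use is simply the annual cost of a resource divided by its recorded use. A minimal sketch in Python, with invented figures, might look like this:

    # Cost per use: annual subscription cost divided by recorded (COUNTER) use.
    # All figures are invented for illustration.
    annual_cost = 25000.00      # subscription cost for the year (GBP)
    counter_usage = 41350       # e.g. total item requests for the year

    cost_per_use = annual_cost / counter_usage
    print(f"Cost per use: £{cost_per_use:.2f}")   # Cost per use: £0.60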

It must be pointed out that the current methodology, as detailed below, has a significant weakness: it does not include any qualitative evaluation asking users what their resource needs are and whether they feel those needs are being met by the resources provided. The lack of qualitative data to support resource reviews needs to be addressed by further research.

The methodology questions

We developed a list of questions to be answered for each resource. These were produced through workshops with other Senior Library Managers and initially reflected the questions we wanted to ask about large journal deal packages, but later evolved to also examine other higher-cost resources, such as e-book subscription packages.

The questions were:

  1. What is the resource called?
  2. What is the resource?
  3. How long have we had the resource?
  4. Why did we buy the resource initially?
  5. How does use of the resource now differ from then?
  6. What does the resource cost?
  7. How do we currently buy the resource?
  8. Is there any other way to buy the resource?
  9. Are there options for downsizing within the current licence?
  10. Would downsizing have any implications for staff time/cost?
  11. What do we retain if we stop subscribing to the resource?
  12. Who is paying for the resource?
  13. Who is using the resource?
  14. Why are they using the resource?
  15. What are the trends in usage?
  16. Is there a similar resource? If yes, then how do the two compare? What is the relative use of each?
  17. What would be the impact of not having the resource?
  18. Do other universities have the resource?

Each question was then answered using a mix of quantitative data and the acquisition and licence records we hold. The data was stored in an Excel workbook created for each resource, with a summary worksheet and then other worksheets providing tables, graphs and charts of the raw data. These files were then stored in our document management system, where anyone in the Library may access and review them.

Gathering the data and completing the reviews

In terms of gathering background information, it took a significant amount of time to trace some resources through our records. We have extensive holdings of e-mail data dating back to 2000 and large collections of paper licence files on most resources. Other paper records provided information further back; most online resources are unlikely to have been first purchased in online format earlier than 1996, although some were previously available as printed indexes or CD-ROMs. Having realized during the project how hard this information can be to find, we now plan to record the rationale for purchasing a resource in our acquisition system and to investigate in greater depth the possibility of storing licences for the product in electronic format.

Subject coverage

It has proved very useful to document more information about the resources and which subject areas they cover. The project has uncovered a potential lack of awareness of the sheer breadth and benefits of some resources the Library bought more than five years ago. If people are not aware of the content and coverage of the resources we already have, there is a danger they will buy very similar content from another provider, leading to duplication. If we understand the full content of what we have purchased, it will also help us to actively sell the benefits of our resources to faculty and students. These reviews should help in this process.

In order to understand how our collections aligned with the University's 2018 curriculum plan, we looked at which subjects each resource covered and how these matched Open University teaching and research areas. The University currently classifies courses by JACS (Joint Academic Coding System) codes but is in the process of transferring to HECoS (Higher Education Classification of Subjects) codes. As this work was still under way, we decided to adopt the broader subject areas used for classifying and recording student numbers in the University's annual facts and figures reports. This enabled us to match those subject areas for all higher-cost resources and link them with student numbers in those subject areas over a seven-year period. Further work may be required to align resources with HECoS codes once these are fully adopted by the University.
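
To make the matching concrete, the linkage can be thought of as joining two simple mappings: the broad subject areas a resource covers, and the student numbers recorded against those areas each year. The sketch below assumes that structure; the subject labels and figures are hypothetical.

    # Hypothetical sketch: link a resource's broad subject areas to annual
    # student numbers in those areas. Labels and figures are invented.
    resource_subjects = {"Business", "Law"}

    students_by_subject = {
        "Business": {"2016/17": 4200, "2017/18": 4350},
        "Law":      {"2016/17": 1800, "2017/18": 1750},
        "History":  {"2016/17": 2100, "2017/18": 2050},
    }

    for year in ("2016/17", "2017/18"):
        total = sum(students_by_subject[s][year] for s in resource_subjects)
        print(f"{year}: {total} students in subjects covered by the resource")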

Cost analysis

The cost information for the last four academic years was available in the library management system (LMS), Ex Libris' Alma, so finding information about resource costs was relatively straightforward. This information was key to any cost-per-use analysis. Harder to obtain was information about what individual journals or packages would cost outside the deals; sometimes even the providers were unable to give us such information easily. Information about which faculty fund pays for each resource is also held in our LMS for the past four academic years; any earlier data tends to be discoverable only through e-mail records.

The data on how the Library buys each resource was held within our e-mail records and large holdings of paper licence files. For some resources, we were also able to refer to the active deal information on, for example, the Jisc website (the licence subscriptions manager site for UK academic libraries). In most cases it is easy to distinguish which deal we have, but where there are complex deal offers it is not always clear exactly which elements of a deal we have in place. In future this needs to be made more explicit on the order sheets when we place an order. The Library needs a systematic way of recording which elements of a deal we have taken when there is more than one option, and this needs to be recorded in the LMS and with the original licence for the product.

Generally, for the bigger deals the question 'Is there any other way to buy the resource?' could be answered by looking at the details of the deal. In some cases, alternative options could be found on the publisher's or supplier's website. However, finding any cost information was harder without asking the provider directly. In most cases, the Library tended to buy the resource in the most cost-effective way at the time, although for some resources an alternative provider offered the same collection at a similar cost, and the rationale for purchase might then be dictated more by the platform the product was available on. We recommend that, in future, each renewal should include an allowance for the time and effort needed to review alternative suppliers for any product.

When we looked at the implications for staff time or costs of changing how the Library bought a resource, the question was only answerable from experience of where we had previously changed elements of a deal or collection. Often even a simple change can require significant staff time to realign access to the collection within the LMS. When there have been significant changes in the past, such as moving CRC Press content from subscriptions to an evidence-based access model, we have written up a recommendations document detailing the implications of the change. Any significant moves or changes should be documented as case studies to provide information for future reference on how they affect staff time and costs.

Use in teaching

A significant amount of the data on use of the resources came from our EZproxy log-in data, from which we could see which faculties were using each resource. We were also able to see whether the log-ins came from staff or students. The EZproxy data can also be interrogated further in terms of module use; the time required to analyse the logs in this way was not feasible within the scope of this project, although we have done so in the past when looking at the impact of cancelling a resource.
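
A hedged sketch of the kind of tally involved is shown below. EZproxy's log layout is configured locally (via its LogFormat directive), so the field positions and the staff/student username convention assumed here are illustrative only, not a description of our actual configuration.

    # Tally EZproxy log-ins by user group from log lines.
    # Field positions and username prefixes are assumptions for illustration.
    from collections import Counter

    sample_log_lines = [
        # ip - username [date] "request" status bytes  (assumed layout)
        '137.108.1.1 - stu12345 [01/Feb/2018:10:00:01 +0000] "GET /login HTTP/1.1" 200 512',
        '137.108.1.2 - stf00042 [01/Feb/2018:10:02:17 +0000] "GET /login HTTP/1.1" 200 512',
    ]

    logins_by_group = Counter()
    for line in sample_log_lines:
        username = line.split()[2]  # assumed position of the authenticated user
        # Hypothetical convention: staff usernames start 'stf', students 'stu'.
        group = "staff" if username.startswith("stf") else "student"
        logins_by_group[group] += 1

    print(logins_by_group)  # Counter({'student': 1, 'staff': 1})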

We use an in-house system called LibLink to record Library links being used in VLEs (virtual learning environments). This was particularly helpful for the large journal collections, where we could see whether modules use a general link to a journal title or more granular links to individual articles. We obtained data on submissions to Open Research Online (ORO), the University's Open Access Repository, to act as a proxy for some element of research use of the journal packages but, as mentioned previously, the only way to really answer this question is to ask the researchers themselves. A possible change in how we manage links for modules (e.g. moving to reading-list software) would have implications for this kind of data gathering going forward.

For usage trends, the previous light-touch annual resource reviews held usage data going back several years (in many cases to the financial year 2012/13) and so were able to provide the usage trends needed. What this project has shown is the value of retaining this data, which needs to be collected in a systematic and regular way. Like most libraries, we rely on COUNTER data and have benefited greatly from the introduction of JUSP (the Jisc Usage Statistics Portal) as a method of harvesting, storing and maintaining COUNTER data for many of our resources. JUSP even provides trend reporting as a standard report and as a graphic.
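
Where yearly COUNTER totals have been retained, the kind of trend reporting JUSP provides can be approximated with a simple year-on-year calculation. A minimal sketch, with invented totals:

    # Year-on-year usage trend from retained annual COUNTER totals.
    # The totals below are invented for illustration.
    usage_by_year = {
        "2012/13": 30500,
        "2013/14": 33900,
        "2014/15": 36200,
        "2015/16": 35100,
    }

    years = sorted(usage_by_year)
    for prev, curr in zip(years, years[1:]):
        change = usage_by_year[curr] - usage_by_year[prev]
        pct = 100 * change / usage_by_year[prev]
        print(f"{prev} -> {curr}: {change:+d} ({pct:+.1f}%)")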

In analysing the first group of resources, we looked at other potentially overlapping resources (only likely to apply to aggregated sources, since large journal collections do not overlap with each other). We used the overlap analysis tools in our LMS, checking whether the titles were covered in any of our major aggregator collections. As the reviews progressed, it became clear that most resources had excellent use across the University and that it would be unlikely we would wish to withdraw any of the major packages. However, there may be levels of overlap between aggregated sources and subject indexes that could be worth investigating. For database collections that are abstracts and indexes, we checked whether we had a collection covering the same subject area or looked at how the subjects covered would be indexed by Scopus or Web of Science. For book collections, we did an overlap analysis comparing the collection with aggregators such as Credo or Academic Complete. Another part of the reviews looked at this in detail and a separate overlap methodology was followed. As new resources are recommended, we need to actively analyse potential overlap with existing collections before purchase, especially where the new resource is an aggregated source or a subject index.
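
At its core, an overlap analysis of this kind compares title lists between collections. The sketch below shows the idea using sets of identifiers (e.g. ISSNs); the identifiers are placeholders, and a full analysis such as the one run in our LMS would also need to account for coverage dates.

    # Minimal overlap sketch: compare two collections as sets of title
    # identifiers (e.g. ISSNs). The identifiers below are placeholders.
    collection_a = {"1234-5678", "2345-6789", "3456-7890"}
    collection_b = {"2345-6789", "3456-7890", "4567-8901"}

    overlap = collection_a & collection_b
    unique_to_a = collection_a - collection_b

    print(f"{len(overlap)} overlapping titles; "
          f"{len(unique_to_a)} unique to collection A")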

Impact of loss of access

To look at the impact of not having a resource at all, we carried out a detailed analysis of usage at title level. This involved looking at the use of all titles included in a journal package and obtaining from the publishers a list of their current journal list prices (the price charged outside of a deal) to show the cost per use of each title. We then looked at how many of the titles, working down from the most used, we would be able to purchase for the same cost as the current deal. That also enabled us to see which titles we would no longer have access to if we bought only the top-used titles. This is particularly relevant for the journal packages where we still buy individual subscriptions; where the Library pays only an access fee for some titles, these would be lost if we cancelled the deal.
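
In outline, this title-level analysis ranks titles by use and then works down the list, buying at list price until the cost of the current deal is exhausted; whatever falls below the cut-off is what would be lost. A minimal sketch, with invented titles, usage and prices:

    # Rank titles by usage, then see how many of the top-used titles could be
    # bought individually (at list price) for the cost of the current deal.
    # All titles and figures are invented for illustration.
    deal_cost = 6000.00

    # (title, annual usage, individual list price in GBP)
    titles = [
        ("Journal A", 5200, 1800.00),
        ("Journal B", 4100, 2400.00),
        ("Journal C", 900, 1500.00),
        ("Journal D", 120, 2100.00),
    ]

    titles.sort(key=lambda t: t[1], reverse=True)  # most-used first

    spent, kept = 0.0, []
    for title, usage, list_price in titles:
        if spent + list_price > deal_cost:
            break
        spent += list_price
        kept.append(title)

    lost = [t for t, _, _ in titles[len(kept):]]
    print("Would retain:", kept)   # ['Journal A', 'Journal B', 'Journal C']
    print("Would lose:", lost)     # ['Journal D']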

Post-cancellation access is covered by the licences in many of the larger deals, but the title-by-title information and implications of cancelling a deal are very detailed, and specific to the individual collection. For some of the deals, we were able to give broad guidance on which years the Library would retain access for but, if a cancellation were to be made, each collection would need to be looked at in significant detail. Opportunities to find ways of managing post-cancellation access should be investigated, for example by participating in projects by Jisc and EDINA to help in this regard.

Benchmarking

Some benchmarking data for the larger collections was available in JUSP and it was possible to see which of the large University groups had access to the collection and the use they made of it (in terms of overall usage figures). Using connections in other institutions, or by checking other library websites, it was possible to see who else had a subscription. Jisc Collections have also previously given us information on subscriptions to their deals by different Jisc bands.

Recommendations

We were very lucky as a project team to be given the time and resources to undertake this project, but we needed to think carefully about how such work could be embedded within the Library's day-to-day work. Some of the lessons learned related to thinking ahead when purchasing resources and recording information in our LMS acquisition module. This would include information such as the purchasing rationale, why the content has been renewed and any analysis of alternative products carried out during the initial evaluation of the product. We have also found it very beneficial to align our resources to the subjects in the curriculum plan, as this has proved helpful in demonstrating the Library's effectiveness in supporting students across the University. Another important finding was how helpful it can be for future collection development planning to have evidence of our earlier decisions about purchasing, cancelling or otherwise changing how we buy content. This can build a knowledge bank of best practice for similar decisions we have to make about our content.

Until this project was carried out, there had been no full examination of some of the overlap between the Library's content. The project team took advantage of the overlap analysis tools in Ex Libris' Alma to run comparisons between large collections and work out the unique selling points of particular resources within the Library's collections. This is something we recommend is carried out before any purchase the Library makes in the future. In terms of recommendations for publishers and suppliers, we suggest they give thought to making it easier to see the exact composition of some of the larger packages they sell, to providing price lists for collections and individual titles on their websites and, especially, to providing details of post-cancellation access arrangements and our current holdings. For example, some publishers have provided us with Excel worksheets detailing our entitlements to their content, covering front-file, back-file and archive content, which has enabled us to look at gaps in our holdings.

Conclusion

The development of this methodology has helped refine and enhance existing resource reviews and collection development policies. It has led to better understanding of the overall collections we offer and helped us to realize some efficiencies in data analysis and collection methods.

Following the methodology enabled us to examine in greater detail the higher-cost resources the Library buys. Some elements of the methodology need to be regularly updated (e.g. annually), whereas others will just need minimal updates as the resource evolves (e.g. the information about what it is). One caveat on the work carried out so far is that the methodology has been primarily used to analyse the large journal packages. Some of the questions would not require the same level of analysis for other types of resources, e.g. subject indexes, where there would not necessarily be an option to downsize, or any post-cancellation access.

The project team were able to use the time spent during the project to calculate how much more time would be required to apply the methodology to other resources (what we called mid-cost content), and this has been used to develop further staff resource planning documents. We have since completed many more reviews and provided training to other staff in the team to enable them to keep the current reviews updated and to create new ones when necessary.