E-book audit project

Can I have a refund?

‘I pay £9,000 a year tuition fees and two thirds of the books on my reading list are unsuitable for my access needs. Can I have a refund?’

It’s a good question. Around ten per cent of students in higher education (HE) have a print impairment. They are unable to access printed text, commonly due to a visual or physical impairment, or a specific learning difficulty such as dyslexia. They can experience significant barriers when accessing information for their studies and this can negatively impact attainment.

In 2016 a group of library/disability professionals from across UK HE, Jisc and representatives from e-book suppliers collaborated on a crowdsourced audit of e-book accessibility. The focus was on e-books for the UK education sector rather than e-books for mainstream commercial consumption.

The group formed through LIS-ACCESSIBILITY, a JISCMail list for sharing best practice around supporting disabled students. Through discussions on the list, a need was identified for a practical tool to benchmark e-book accessibility.

Why is there a problem?

Well-designed and inclusive e-books can address the barriers to information experienced by students with print impairments. Electronic formats should be adaptable to the specific needs of individual students. Unfortunately, this is not always the case and many students find that e-books are inaccessible to them. For example, many students with dyslexia need to change the background and font colours; people with visual impairments may need to zoom text to a high level and have the text reflow (i.e. automatically adjust to fit the page at the new zoom level). These features are not consistently available across all platforms.

As illustrated by Figure 1, there are various factors affecting the accessibility of the end-user experience, and various points at which accessibility can be lost: ‘the publisher is merely the beginning of the journey’.

Figure 1 

Flow diagram showing potential ‘accessibility attrition’ points

Wider context

Previous work has investigated the accessibility of e-books provided to the UK HE sector. For example, the Open University Library conducts accessibility checks on the online resources it subscribes to and provides accessibility information to users via the library website, as well as giving feedback to publishers.

In 2010 Jisc TechDis, Jisc Collections and the Publishers Licensing Society sponsored a small-scale project to explore e-book platform accessibility and provide guidance for publishers and aggregators. Participants completed a series of tests designed to replicate typical user activities using different access technologies. The research showed wide variability in the ease with which testers could complete each task, depending on the access technology and platform they were using. Various recommendations were made to providers on how to improve the user experience, and the authors identified ‘a need to raise awareness of accessibility issues; increase the capacity to assess the accessibility of e-book platforms and to support the publishing industry in developing expertise in the accessibility of e-books’.

Jisc TechDis subsequently carried out further research with a grant from CLAUD. This involved a survey of HE library requests for resources in alternative formats, one aspect of which covered the accessibility of participants’ e-book systems. Key findings were that respondents were uncertain about the accessibility of their e-book platforms and that most platforms were only partially accessible. The authors highlight ‘a responsibility for library staff to be informed about the products they procure and promote to learners – specifically about the basics of accessibility like font size/colour/reflow, keyboard access, text to speech and screen-reader access’.

In 2014 the University of Bradford conducted a systematic accessibility audit of their most used electronic resources, including e-book platforms. It was found that text reflow did not work in the majority of cases. Another issue identified was the production of PDFs as scanned images rather than machine-readable text, meaning that many accessibility features do not work. Furthermore, many of the resources tested lacked an inbuilt tool for reading text aloud. The authors outline a duty for libraries to raise publishers’ awareness of accessibility issues with their products. They also urge others to repeat and build on the project, to open the conversation on accessibility with suppliers, and for libraries and publishers to work together to address accessibility needs.

Also in 2014, outside the UK, San Jose State University conducted an E-book Accessibility Project evaluating the accessibility of 16 major academic e-book platforms. The project found that the platforms tested lacked accessibility features offered by commercial e-book providers. The authors encourage librarians to ‘let publishers know that accessibility is a major consideration in their e-book adoptions and urge their compliance with common accessibility standards’.

More recently, a 2015 Northern Collaboration study in the UK looked at the experience of student use of e-books on mobile devices. The study used a four-part matrix, examining the licence, platform, mobile interface and accessibility for five key e-book suppliers. There is considerable overlap between mobile accessibility and accessibility for disabled users, as acknowledged by Jisc TechDis: ‘The clean, flexible, consistent interfaces that support high levels of disabled access are very similar to those required for use on a mobile browser.’ A key finding was that online e-book readers require third-party software which is not installed as standard on all universities’ PCs. The study also found that it is not only e-book platforms and files that constrain usability and accessibility, but also the limitations of the software or app (chosen by the supplier) used to access downloaded e-books. For example, four of the five providers use Adobe Digital Editions, which has very limited functionality, to open e-books downloaded as PDF on a PC. The authors of the study report note that e-book accessibility is particularly pertinent given recent changes to Disabled Students’ Allowance funding, and make a number of recommendations, including further testing of e-book accessibility and sharing the findings of the paper with suppliers ‘to encourage dialogue within the whole community in order to improve the user experience’.

Existing guidelines mean little to those who matter

Accessibility guidelines for web content already exist. However, these are beyond the technical understanding of most stakeholders, such as those responsible for procuring e-books, creating reading lists or supporting students with e-books. Further, technical accessibility standards may not always translate into an accessible end-user experience.

A need was identified for a tool to assess the accessibility of e-books which could:

  • bridge this gap with a shared framework of simple, easy-to-measure criteria
  • reflect the end-user perspective
  • be used by experts and non-experts alike to discuss accessibility and define areas for improvement.

The project team considered using Web2Access, a resource which offers users the ability to view, create and submit accessibility reviews of Web 2.0 services. However, it was decided that a tool developed specifically for e-books with non-experts in mind would be more appropriate.

‘Show me…’ – the language of empowerment

Key to the success of the project was a framework for librarians and providers to discuss accessibility in a demonstrable way. ‘We are W3C compliant’ is easy for a supplier to claim and hard for a purchaser to challenge. By developing easily demonstrated criteria, purchasers are empowered. For librarians to engage with this framework, they need to be able to empathize with disabled learners by understanding what makes an accessible end-user experience.

Crowdsourcing the study

To ensure maximum engagement, shared workload and scalability, the project took a crowdsourcing approach, with the aim of involving as many people as possible from across the sector.

The LIS-ACCESSIBILITY mailing list was central to building this community. It was used to identify the most widely used e-book platforms in order to decide which platforms to focus on in the audit. In addition to drawing on the criteria used in previous studies, LIS-ACCESSIBILITY was also used to elicit suggestions on questions to include in the audit tool and to gather feedback to refine the questions, as well as volunteers for conducting the audits.

The initial project team also joined forces with the National Consortia for Monographs e-books sub-group, which, following the recent Northern Collaboration study, was also looking to audit e-book platforms for accessibility, with a view to informing procurement decisions. Volunteers for the audit came from 33 institutions, representing 20% of the sector, along with five volunteers from e-book providers/suppliers.

NoWAL training event

To introduce key accessibility concepts to those involved in the audit, a training event was organized for members of the NoWAL (North West Academic Libraries) consortium. This included an overview of the project, an introduction to the key components of accessibility, and a demonstration of, and hands-on practice in, testing e-books against the different audit questions.

Guidance – hints and tips

To supplement the training event (and support the wider audit), the project team put together comprehensive online guidance for completing the audit questions. This took various forms.

Within the online audit tool there was brief guidance on answering each question. A ‘Hints and Tips’ document was also developed, containing general guidance on topics such as choosing an e-book to audit, accessing the e-book platform, the level of evaluation required and where to get further help. It also contained tips on answering specific questions, linked to the relevant sections of the audit tool to ensure that support was available at the point of need. These included advice on distinguishing the platform interface from e-book content and on identifying the different e-book formats available.

More in-depth training materials were also developed for some questions and included in the ‘Hints and Tips’ document. For example, for questions 2H – 2L (‘How do the original colours on the platform interface and e-book content perform against WCAG 2.0 colour contrast success criteria?’), an online video was produced explaining how to test colour contrast using a free tool available online. Video demonstrations were also produced to explain how to check for the presence of a ‘skip to content’ or ‘skip navigation’ link on the platform interface, and how to check images for alternative text labels.
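As background for readers who have not used such a checker, the short Python sketch below reproduces the WCAG 2.0 contrast calculation; the colours and thresholds shown are illustrative examples, and the audit itself relied on an existing free online tool rather than custom code.

```python
# Minimal sketch of the WCAG 2.0 colour contrast calculation used by such tools.

def srgb_to_linear(channel_8bit):
    """Convert an 8-bit sRGB channel to its linearized value (WCAG 2.0 definition)."""
    c = channel_8bit / 255.0
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb):
    """Relative luminance of an (R, G, B) colour, each channel 0-255."""
    r, g, b = (srgb_to_linear(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(colour_a, colour_b):
    """Contrast ratio between two colours, per WCAG 2.0 (ranges from 1:1 to 21:1)."""
    lighter, darker = sorted(
        (relative_luminance(colour_a), relative_luminance(colour_b)), reverse=True
    )
    return (lighter + 0.05) / (darker + 0.05)

# Example: dark grey text (#333333) on a white background (#FFFFFF)
ratio = contrast_ratio((0x33, 0x33, 0x33), (0xFF, 0xFF, 0xFF))
print(f"{ratio:.2f}:1")                              # ~12.6:1
print("Passes AA (normal text):", ratio >= 4.5)      # success criterion 1.4.3
print("Passes AA (large text): ", ratio >= 3.0)
```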

Issues encountered

Lack of text-to-speech confidence

Although the audit questions were designed for accessibility non-experts, the project team had failed to anticipate that some volunteers would have no experience at all of using assistive technology to read e-books aloud. Whilst it was suspected that many people would not be familiar with screen-reading technology, it was assumed that everyone would at least have access to text-to-speech software. This was not the case. Some had no access to any suitable software and many of those who did lacked the experience and confidence to use it to complete the relevant audit questions. Thus several audits were completed with missing data for the questions in Section 4: Reading text aloud. The project team filled some of the gaps by completing screen-reader and text-to-speech testing for platforms with missing data so as to avoid skewing results.

However, this issue highlighted an important point: there is a need for further ‘user education’ across the sector on text-to-speech technology, which plays an important part in the learning experience of many disabled students. Screen-reading software is essential for blind students to access digital information, and text-to-speech is similarly essential for many dyslexic students. As people with print impairments comprise a significant proportion of the student population and the role of libraries is to facilitate access to knowledge, it would be valuable for this gap in understanding to be addressed. It is especially relevant for frontline staff, but acquisitions staff also need to take account of it in licensing discussions. Text-to-speech has the potential to benefit all users.

Lack of screen-reader skill

The team were not surprised that few respondents completed screen-reader testing, as it is a skill that takes time and training to acquire. A scoring method was devised that adjusted for absent screen-reader scores. However, the implication is that most scores are potentially a little more positive than they would otherwise be. The screen-reader experience is expected to be poor on many platforms – a good one requires more expertise to achieve – which would normally depress a platform’s score; but wherever the screen-reader questions were left unanswered, the statistical adjustment came into play, artificially raising the score.
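The project’s actual adjustment formula is not reproduced here, but a minimal, hypothetical sketch (with invented question identifiers and scores) illustrates why scoring a platform only on the questions that were answered can inflate its percentage when the hardest questions are the ones left blank.

```python
# Hypothetical illustration: scoring over answered questions only.
# Question identifiers and values are invented for this example.

def percentage_score(answers):
    """Score a platform as the percentage of points achieved on *answered* questions.

    `answers` maps question id -> points awarded (0 or 1 here), or None if unanswered.
    Unanswered questions are excluded from both numerator and denominator,
    which is one way of 'adjusting' for missing data.
    """
    answered = {q: v for q, v in answers.items() if v is not None}
    if not answered:
        return 0.0
    return 100.0 * sum(answered.values()) / len(answered)

# A platform that performs well on general questions but poorly with a screen reader:
full_audit    = {"zoom": 1, "reflow": 1, "contrast": 1,
                 "screen_reader_nav": 0, "screen_reader_read": 0}
partial_audit = {"zoom": 1, "reflow": 1, "contrast": 1,
                 "screen_reader_nav": None, "screen_reader_read": None}

print(percentage_score(full_audit))     # 60.0 - poor screen-reader results pull the score down
print(percentage_score(partial_audit))  # 100.0 - the same platform looks better when those questions are skipped
```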

Making results meaningful

The audit generated a wealth of data but not all library staff or disability support staff are equally comfortable with handling spreadsheets of this magnitude. Part of the purpose of the audit was to help staff understand that accessibility is ‘context sensitive’ – what is inaccessible to one learner may be more than adequate for another. The team developed an interactive way to show how accessibility scores could vary depending on the criteria that a learner required.

Forty-four e-book platforms were tested, covering 275 e-books from 65 publishers. The results are published on the E-book Audit 2016 website. A spreadsheet automatically calculates a percentage score based on each platform’s performance against the audit questions. The ‘Results’ spreadsheet also displays the average maximum and minimum potential scores a platform could achieve if all questions were answered for its highest and lowest performing formats. There can be substantial differences in score between different formats of the same book, which raises the question: how many libraries provide recommendations as to the best formats on a given platform?

The spreadsheet also contains two different interactive dashboards which allow users to assign different importance weightings to accessibility criteria using slider bars. Based on these weightings, ranked lists of platforms and publishers are generated. In addition, to facilitate working with e-book providers to improve accessibility (one of the key aims of the project), individual platform feedback reports were created and made available online. The table on page six of each report gives an overview of how the platform performed against the different criteria.
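The dashboards themselves are spreadsheet-based; the following minimal Python sketch, using invented criteria, weights and platform scores rather than the audit’s real data, illustrates the underlying idea of user-weighted scoring and ranking.

```python
# Hypothetical illustration of weighted ranking; criteria scores and weights are invented.

def weighted_score(criteria_scores, weights):
    """Weighted percentage score for one platform.

    `criteria_scores` maps criterion -> score in [0, 1];
    `weights` maps criterion -> importance set by the user (e.g. a slider from 0 to 10).
    """
    total_weight = sum(weights.values())
    if total_weight == 0:
        return 0.0
    return 100.0 * sum(criteria_scores.get(c, 0.0) * w for c, w in weights.items()) / total_weight

platforms = {
    "Platform A": {"reflow": 1.0, "contrast": 0.5, "text_to_speech": 0.0},
    "Platform B": {"reflow": 0.5, "contrast": 1.0, "text_to_speech": 1.0},
}

# A dyslexic reader might weight colour contrast and text-to-speech heavily:
weights = {"reflow": 2, "contrast": 8, "text_to_speech": 10}

ranking = sorted(platforms, key=lambda p: weighted_score(platforms[p], weights), reverse=True)
for name in ranking:
    print(name, round(weighted_score(platforms[name], weights), 1))
# Platform B ranks first here; with different weights (e.g. reflow-heavy), Platform A could come out on top.
```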

Final staff survey

Once the audit was completed, the project team felt it was important to capture the staff development that resulted from it, and to identify and address any areas where those involved would benefit from additional training and support. In order to achieve this, a survey was produced and circulated to those involved in the audit. ‘Question 1’ asked whether this project was the participant’s first experience of auditing e-book accessibility, and the subsequent questions differed depending on the answer given.

Survey results

First-time testers

The survey received 45 responses. For 71.1% of respondents (32 of 45), this was their first experience of auditing e-book accessibility. These participants went on to answer the questions in ‘Section 2 – first-time accessibility testers’.

The results (shown in Figure 2) indicated that 93.7% of ‘first timers’ (30 of 32) agreed or slightly agreed that they felt more empathy for their disabled learners as a result of taking part in the project, 90.6% (29 of 32) agreed or slightly agreed that they felt better equipped to advise their disabled learners and 96.9% (31 of 32) agreed or slightly agreed that e-book suppliers should provide better accessibility information. Other responses indicated that 93.7% (30 of 32) agreed or slightly agreed that accessibility should influence library procurement of e-books, 71.9% (23 of 32) agreed or slightly agreed that e-book accessibility should influence reading list recommendations, 96.9% (31 of 32) agreed or slightly agreed that the e-book accessibility audit was useful for staff development, 87.5% (28 of 32) agreed or slightly agreed that the supporting resources were helpful and 87.5% (28 of 32) agreed or slightly agreed that they would be interested in further training if it was available.

Figure 2 

Survey results from first-time testers. (See text for details)

There were high levels of agreement across these very different criteria; the area with least agreement was whether e-book accessibility should influence reading list recommendations. Such selection and procurement decisions are complex and multilayered, so it is possible some respondents worried that a rigid approach to accessibility might deny them access to core titles.

Experienced testers

Participants who already had experience of e-book accessibility testing were directed to the questions in ‘Section 3 – experienced accessibility testers’. Here the main questions asked whether their previous perceptions had changed as a result of taking part. Little change was expected, since they were already very aware, but the responses were more positive than anticipated, suggesting that even this group had perceived some ‘distance travelled’ in terms of their awareness and expectations.

The results indicated that, in relation to empathy for their disabled learners, 61.5% of participants (8 of 13) felt more strongly about accessibility as a result of taking part in the project; 76.9% (10 of 13) felt better equipped to advise their disabled learners; 69.2% (9 of 13) felt more strongly that e-book suppliers should provide better accessibility information; and 69.2% (9 of 13) felt more strongly that accessibility should influence library procurement of e-books.

The lowest response was the 46.2% (6 of 13) who felt more strongly that e-book accessibility should influence reading list recommendations; the remainder ‘felt just as strongly as before’. In addition, 92.3% (12 of 13) agreed or slightly agreed that the e-book accessibility audit was useful for staff development, 84.6% (11 of 13) agreed or slightly agreed that the supporting resources were helpful and 92.3% (12 of 13) agreed or slightly agreed that they would be interested in further training if it was available.

Training suggestions

All participants were also invited to make suggestions for further training. Some responses focused on generic awareness of disability issues including:

  • general understanding of disabled students’ experience of accessing e-books
  • specific issues that disabled people encounter when using e-books
  • attempting to access e-books using impairment simulation tools (e.g. blindfolds or glasses with blurred lenses).

Some suggestions focused on e-books and platforms such as:

  • training on using the supporting materials
  • accessing e-books on mobile devices
  • variations between platforms
  • specific guidance on accessing e-books from a particular provider
  • different file types and file structures, including identifying file structures that result in an accessible document and how these can be improved to increase accessibility.

Another theme was training on assistive software. Suggestions covered the accessibility tools available for Mac OS and iOS devices (including free and built-in tools) and the use of assistive software and accessibility features within software such as Adobe Digital Editions. Training on assistive technology for different disabilities was also suggested (e.g. tools for dyslexic users and tools for partially sighted users).

Finally, additional training was suggested on using screen-reading and text-to-speech tools with e-books, including free screen-reading tools and how different file types can be accessed with text-to-speech tools. This topic stood out as a key theme, mentioned in half of the comments received (8 of 16). This supports the earlier findings of the project team that there is a need for additional support and development amongst librarians in UK HE around using screen-reading and text-to-speech tools.

E-book audit implications for e-book suppliers and HE and FE providers

It is important to note that creating an accessible end-user e-book experience is complex and affected by multiple factors. Oversimplification should therefore be avoided, both in the use of the audit tool and in the interpretation of the audit results, as the measures used in the audit capture only part of the accessibility picture. With this in mind, some suggestions for different groups on using the audit results follow.

E-book suppliers

Suppliers are encouraged to look at their individual platform feedback report and use this to guide improvements to accessibility. Although the audit was designed to provide a snapshot of accessibility at the time it was conducted, the audit questionnaire remains available online and, when suppliers make changes to their platforms, they are encouraged to re-audit them. The E-book Accessibility Audit project team will then update the results spreadsheet and the platform feedback reports with the new data.

Collections management and e-book procurement

It is suggested that those involved in e-book procurement use the individual platform feedback reports for the platforms they subscribe to, and consider the results of the audit when negotiating with e-book providers, using the audit as a framework to start conversations about accessibility. This should be done for both new and existing subscriptions. Raising awareness with suppliers is critical to effecting change.

Academics involved in compiling reading lists

It would be beneficial for institutions to work towards a joined-up approach in which academics are alerted to the more accessible platforms and publishers. Where there is flexibility in reading lists, it is better for students (and budgets) to point to more accessible providers. The individual platform feedback reports, in particular the ‘score breakdown’ overview tables on page six of each report, can raise awareness of the elements that constitute accessibility and of the variability in accessibility between different platforms. Having platform accessibility influence resource selection is challenging, but there is certainly an argument for raising awareness and questioning suppliers.

Technologists

The flow diagram shown in Figure 1 above indicates the points in the e-book supply chain at which accessibility can be lost, so raising awareness amongst learning technologists and IT infrastructure teams is critical to ensuring that local installations of key software (Adobe Digital Editions/Adobe Reader and text-to-speech tools) are up to date and installed across all networks. Devenney et al. recommend these are installed at all Northern Collaboration universities to improve the student experience.

Library and disability staff

It is recommended that library and disability staff refer to the individual platform feedback reports when supporting students in order to advise users on the features they can expect to find within each e-book platform. It would also be useful to show students the interactive dashboards on the scoring spreadsheet so they can select the accessibility features that are high priority for them personally and find out the best platforms to use to meet those requirements.

Conclusion

The e-book accessibility audit has been a valuable and informative exercise. Significant variability in accessibility has been revealed between different e-book platforms, which could be due to a variety of factors. Individual feedback reports have been produced for widely used e-book platforms, making their accessibility in different areas more transparent and providing a quick and easy overview for staff to refer to when supporting students. The audit questionnaire represents a free tool which can be used to assess key elements of e-book accessibility. There are plans to run an updated survey in 2018.

Awareness of e-book accessibility has been raised across a large section of the UK HE sector. As a result of taking part in the audit, participants reported feeling more empathy for disabled learners and better equipped to advise them. A need has been identified for further training on various topics, in particular using e-books with screen-reading and text-to-speech software.

The majority of participants felt that e-book suppliers should provide better accessibility information and that accessibility should influence e-book procurement and reading list recommendations.

It is hoped that the e-book audit will act as a framework to support conversations between practitioners and providers which will result in improvements in accessibility that benefit all students across the sector. Making things accessible for disabled students improves usability for everybody.