‘I pay £9,000 a year tuition fees and two thirds of the books on my reading list are unsuitable for my access needs. Can I have a refund?’
It’s a good question. Around ten per cent of students in higher education (HE) have a print impairment.1 They are unable to access printed text, commonly due to a visual or physical impairment, or a specific learning difficulty such as dyslexia. They can experience significant barriers when accessing information for their studies and this can negatively impact attainment.
In 2016 a group of library/disability professionals from across UK HE, Jisc and representatives from e-book suppliers collaborated on a crowdsourced audit of e-book accessibility. The focus was on e-books for the UK education sector rather than e-books for mainstream commercial consumption.
The group formed through LIS-ACCESSIBILITY, a JISCMail list for sharing best practice around supporting disabled students. Through discussions on the list, a need was identified for a practical tool to benchmark e-book accessibility.
Well-designed and inclusive e-books can address the barriers to information experienced by students with print impairments. Electronic formats should be adaptable to the specific needs of individual students. Unfortunately, this is not always the case and many students find that e-books are inaccessible to them. For example, many students with dyslexia need to change the background and font colour; people with visual impairments may need to zoom text to a high level and have the text reflow (i.e. automatically adjust to fit the page at the new zoom level). These features are not consistently available across all platforms.
Previous work has investigated the accessibility of e-books provided to the UK HE sector. For example, the Open University Library4 conducts accessibility checking of the online resources it subscribes to and provides accessibility information to users via the library website, as well as providing feedback to publishers.
In 2010 Jisc TechDis, Jisc Collections and the Publishers Licensing Society sponsored a small-scale project5 to explore e-book platform accessibility and provide guidance for publishers and aggregators. Participants completed a series of tests designed to replicate typical user activities using different access technologies. The research showed wide variability in the ease with which testers could complete each task, depending on the access technology and platform they were using. Various recommendations were made to providers on how to improve the user experience, and the authors identified ‘a need to raise awareness of accessibility issues; increase the capacity to assess the accessibility of e-book platforms and to support the publishing industry in developing expertise in the accessibility of e-books’.
Jisc TechDis subsequently carried out further research6 with a grant from CLAUD. This involved a survey of HE library requests for resources in alternative formats, one aspect of which covered the accessibility of participants’ e-book systems. Key findings were uncertainty amongst respondents about the accessibility of their e-book platforms and partial accessibility of most platforms. The authors highlight ‘a responsibility for library staff to be informed about the products they procure and promote to learners – specifically about the basics of accessibility like font size/colour/reflow, keyboard access, text to speech and screen-reader access’.
In 2014 the University of Bradford conducted a systematic accessibility audit of their most used electronic resources,7 including e-book platforms. It was found that text reflow did not work in the majority of cases. Another issue identified was PDFs produced as scanned images rather than machine-readable text, meaning that many accessibility features do not work. Furthermore, many of the resources tested lacked an inbuilt tool for reading text aloud. The authors outline a duty for libraries to raise publishers’ awareness of accessibility issues with their products. They also urge others to repeat and build on the project, to open the conversation on accessibility with suppliers, and for libraries and publishers to work together to address accessibility needs.
Also in 2014, outside the UK, San Jose State University conducted an E-book Accessibility Project8 evaluating the accessibility of 16 major academic e-book platforms. The project found that the platforms tested lacked accessibility features offered by commercial e-book providers. The authors encourage librarians to ‘let publishers know that accessibility is a major consideration in their e-book adoptions and urge their compliance with common accessibility standards’.
More recently, a 2015 Northern Collaboration study9 in the UK looked at the experience of student use of e-books on mobile devices. The study used a four-part matrix, examining the licence, platform, mobile interface and accessibility for five key e-book suppliers. There is considerable overlap between mobile accessibility and accessibility for disabled users, as acknowledged by Jisc TechDis:10 ‘The clean, flexible, consistent interfaces that support high levels of disabled access are very similar to those required for use on a mobile browser.’ A key finding was that online e-book readers require third-party software which is not installed as standard on all universities’ PCs. The study also found that it is not only e-book platforms and files that constrain usability and accessibility, but also the limitations of the software or app (chosen by the supplier) used to access downloaded e-books. For example, four of the five providers use Adobe Digital Editions, which has very limited functionality, to open e-books downloaded as PDF on a PC. The authors of the study report note that e-book accessibility is particularly pertinent given recent changes to Disabled Students’ Allowance funding, and make a number of recommendations, including further testing of e-book accessibility and sharing the findings of the paper with suppliers ‘to encourage dialogue within the whole community in order to improve the user experience’.11
Accessibility guidelines for web content are already in existence.12 However, these are beyond the technical understanding of most stakeholders, such as those responsible for procuring e-books, creating reading lists or supporting students with e-books. Further, technical accessibility standards may not always translate to an accessible end-user experience.
A need was identified for a tool with which non-experts could assess the accessibility of e-books.
The project team considered using Web2Access,13 a resource which offers users the ability to view, create and submit accessibility reviews of Web 2.0 services. However, it was decided that a tool developed specifically for e-books with non-experts in mind would be more appropriate.
Key to the success of the project was a framework for librarians and providers to discuss accessibility in a demonstrable way. ‘We are W3C compliant’ is easy for a supplier to claim and hard for a purchaser to challenge; easily demonstrated criteria empower purchasers. For librarians to engage with this framework, they need to be able to empathize with disabled learners by understanding what makes an accessible end-user experience.
To ensure maximum engagement, shared workload and scalability, the project took a crowdsourcing approach, with the aim of involving as many people as possible from across the sector.
The LIS-ACCESSIBILITY mailing list was central to building this community. It was used to identify the most widely used e-book platforms in order to decide which platforms to focus on in the audit. In addition to drawing on the criteria used in previous studies, LIS-ACCESSIBILITY was also used to elicit suggestions on questions to include in the audit tool14 and to gather feedback to refine the questions, as well as volunteers for conducting the audits.
The initial project team also joined forces with the National Consortia for Monographs e-books sub-group, which, following the recent Northern Collaboration study,15 was also looking to audit e-book platforms for accessibility with a view to informing procurement decisions. Volunteers for the audit came from 33 institutions, representing 20% of the sector, along with five from e-book providers/suppliers.
To introduce key accessibility concepts to those involved in the audit, a training event was organized for members of the NoWAL (North West Academic Libraries) consortium. This included an overview of the project, key components of accessibility, a demonstration and hands-on practice of how to test e-books against the different audit questions.
To supplement the training event (and support the wider audit), the project team put together comprehensive online guidance for completing the audit questions. This took various forms.
Within the online audit tool16 there was brief guidance on answering each question. A ‘Hints and Tips’ document17 was also developed, containing general guidance on topics such as choosing an e-book to audit, accessing the e-book platform, the level of evaluation required and where to get further help. It also contains tips on answering specific questions, linked to the relevant sections of the audit tool to ensure that support was available at the point of need. Tips include advice on distinguishing the platform interface from e-book content and identifying the different e-book formats available.
More in-depth training materials were also developed for some questions and included in the ‘Hints and Tips’ document. For example, for questions 2H – 2L (‘How do the original colours on the platform interface and e-book content perform against WCAG 2.0 colour contrast success criteria?’), an online video was produced explaining how to test colour contrast using a free tool available online.18 Video demonstrations were also produced to explain how to check for the presence of a ‘skip to content’ or ‘skip navigation’ link on the platform interface,19 and how to check images for alternative text labels.20
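The WCAG 2.0 colour contrast test behind questions 2H – 2L follows a published formula, so it can also be scripted rather than checked by hand with an online tool. The following Python sketch (not part of the audit tooling; the function names are our own) computes the contrast ratio between two sRGB colours and checks it against the WCAG 2.0 AA thresholds:

```python
def srgb_to_linear(channel):
    # Linearize one sRGB channel (0-255) as defined in WCAG 2.0
    c = channel / 255.0
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb):
    # Relative luminance of an (R, G, B) colour per WCAG 2.0
    r, g, b = (srgb_to_linear(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(colour_a, colour_b):
    # Contrast ratio (lighter + 0.05) / (darker + 0.05), ranging from 1:1 to 21:1
    lighter, darker = sorted(
        (relative_luminance(colour_a), relative_luminance(colour_b)), reverse=True
    )
    return (lighter + 0.05) / (darker + 0.05)

def passes_aa(foreground, background, large_text=False):
    # WCAG 2.0 AA requires 4.5:1 for normal text and 3:1 for large text
    threshold = 3.0 if large_text else 4.5
    return contrast_ratio(foreground, background) >= threshold
```

Black text on a white background, for example, gives the maximum ratio of 21:1, while mid-grey (#777777) on white falls just below the 4.5:1 AA threshold for normal text.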
Although the audit questions were designed for accessibility non-experts, the project team had failed to anticipate that some volunteers would have no experience at all of using assistive technology to read e-books aloud. Whilst it was suspected that many people would not be familiar with screen-reading technology, it was assumed that everyone would at least have access to text-to-speech software. This was not the case. Some had no access to any suitable software and many of those who did lacked the experience and confidence to use it to complete the relevant audit questions. Thus several audits were completed with missing data for the questions in Section 4: Reading text aloud. The project team filled some of the gaps by completing screen-reader and text-to-speech testing for platforms with missing data so as to avoid skewing results.
However, this issue highlighted an important point: there is a need for further ‘user-education’ across the sector on text-to-speech technology, which plays an important part in the learning experience of many disabled students. Screen-reading software is essential for blind students to access digital information, and text-to-speech is similarly essential for many dyslexic students. As people with print impairments comprise a significant proportion of the student population21 and the role of libraries is to facilitate access to knowledge, it would be valuable for this gap in understanding to be addressed. It is especially relevant for frontline staff, but acquisitions staff also need to take account of it in licensing discussions. Text-to-speech has the potential to benefit all users.
The team were not surprised that few respondents completed screen-reader testing, as it is a skill that takes time and training to acquire. A scoring method was devised that adjusted for absent screen-reader scores. However, the implication is that most scores are potentially a little more positive than they would otherwise be: the screen-reader experience is expected to be poor on many platforms, because it requires more expertise to get right, and would normally depress a platform’s score, but wherever the question was unanswered the statistical adjustment came into play, artificially raising the score.
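To illustrate the bias described above, the sketch below shows one simple scoring scheme in which unanswered questions are excluded from the denominator. This is an assumption for illustration only, not the audit’s actual adjustment method, but it shows how a missing screen-reader result can raise a platform’s score whenever that question would have scored poorly:

```python
def platform_score(answers):
    """Percentage score over answered questions only.

    `answers` is a list of per-question results (1 = pass, 0 = fail),
    with None marking an unanswered question. Illustrative only: the
    real audit used a more granular question set and scoring.
    """
    answered = [a for a in answers if a is not None]
    if not answered:
        return None  # no usable data for this platform
    return 100.0 * sum(answered) / len(answered)
```

With answers [1, 1, 0, 0] the score is 50%, but if the last (say, screen-reader) question is left unanswered, [1, 1, 0, None] scores roughly 66.7%.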
The audit generated a wealth of data but not all library staff or disability support staff are equally comfortable with handling spreadsheets of this magnitude. Part of the purpose of the audit was to help staff understand that accessibility is ‘context sensitive’ – what is inaccessible to one learner may be more than adequate for another. The team developed an interactive way to show how accessibility scores could vary depending on the criteria that a learner required.
Forty-four e-book platforms were tested, using 275 e-books from 65 publishers. The results are published on the E-book Audit 2016 website.22 There is a spreadsheet which automatically calculates a percentage score based on each platform’s performance against the audit questions.23 The ‘Results’ spreadsheet also displays the average maximum and minimum potential scores a platform could achieve if all questions were answered for its highest and lowest performing formats. There can be substantial differences in score between different formats of the same book. This raises the question: how many libraries provide recommendations as to the best formats on a given platform?
The spreadsheet also contains two different interactive dashboards which allow users to assign different importance weightings to accessibility criteria using slider bars. As a result of these weightings, lists are generated which rank platforms and publishers accordingly. In addition, to facilitate working with e-book providers to improve accessibility (which was one of the key aims of the project), individual platform feedback reports were created and made available online.24 The table on page six of these reports gives an overview of how the platform performed for the different criteria.
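The slider-driven dashboards can be thought of as computing a weighted sum of per-criterion scores for each platform. A minimal Python sketch of that idea follows; the criterion names, scores and weights are invented for illustration and are not taken from the audit data:

```python
def weighted_rank(platforms, weights):
    """Rank platforms by a weighted sum of criterion scores.

    `platforms` maps a platform name to {criterion: score in 0-1};
    `weights` maps a criterion to the importance a user assigns it
    (the slider position). Criteria with no weight count as zero.
    """
    total_weight = sum(weights.values()) or 1.0

    def score(criteria):
        return sum(weights.get(name, 0) * value
                   for name, value in criteria.items()) / total_weight

    ranked = sorted(platforms.items(), key=lambda item: score(item[1]), reverse=True)
    return [name for name, _ in ranked]
```

A learner who cares only about text-to-speech would weight that criterion highly and may see a different ranking from one who prioritizes, say, text reflow, which is exactly the ‘context sensitive’ point the dashboards were built to demonstrate.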
Once the audit was completed, the project team felt it was important to capture the staff development that resulted from it, and to identify and address any areas where those involved would benefit from additional training and support. In order to achieve this, a survey25 was produced and circulated to those involved in the audit. ‘Question 1’ asked whether this project was the participant’s first experience of auditing e-book accessibility, and the subsequent questions differed depending on the answer given.
The survey received 45 responses.26 For 71.1% of participants (32), this was their first experience of auditing e-book accessibility. These participants went on to answer questions in ‘Section 2 – “first time” accessibility testers’.
The results (shown in Figure 2) indicated that 93.7% of ‘first timers’ (30 of 32) agreed or slightly agreed that they felt more empathy for their disabled learners as a result of taking part in the project; 90.6% (29 of 32) agreed or slightly agreed that they felt better equipped to advise their disabled learners; and 96.9% (31 of 32) agreed or slightly agreed that e-book suppliers should provide better accessibility information. Other responses indicated that 93.7% (30 of 32) agreed or slightly agreed that accessibility should influence library procurement of e-books, 71.9% (23 of 32) that e-book accessibility should influence reading list recommendations, 96.9% (31 of 32) that the e-book accessibility audit was useful for staff development, 87.5% (28 of 32) that the supporting resources were helpful and 87.5% (28 of 32) that they would be interested in further training if it was available.
There were high levels of agreement across the very different criteria, but the area with least agreement related to procurement. Procurement decisions are complex and multilayered, so it is possible some respondents worried that a rigid approach to accessibility might deny them access to core titles.
Participants who were already experienced at e-book accessibility were directed to the questions in ‘Section 3 – experienced accessibility testers’. Here the main questions were directed at whether their previous perceptions were changed because of taking part. Little change was expected since they were already very aware but the responses were more positive than anticipated, suggesting even this group had perceived some ‘distance travelled’ in terms of their awareness and expectation.
The results indicated that in relation to empathy for their disabled learners, 61.5% of participants (8 of 13) felt more strongly about accessibility as a result of taking part in the project. 76.9% of them (10 of 13) felt better equipped to advise their disabled learners. 69.2% (9 of 13) felt more strongly that e-book suppliers should provide better accessibility information. 69.2% (9 of 13) felt more strongly that accessibility should influence library procurement of e-books.
The lowest response was 46.2% (6 of 13) who felt more strongly that e-book accessibility should influence reading list recommendations. The remainder ‘felt just as strongly as before’. 92.3% (12 of 13) agreed or slightly agreed that the e-book accessibility audit was useful for staff development. 84.6% (11 of 13) agreed or slightly agreed that the supporting resources were helpful. 92.3% (12 of 13) agreed or slightly agreed that they would be interested in further training if it was available.
All participants were also invited to make suggestions for further training. Some responses focused on generic awareness of disability issues, while other suggestions focused on e-books and the platforms themselves.
Another theme was training on assistive software. This included accessibility tools available for Mac OS and iOS devices, including free and inbuilt tools, as well as using assistive software and the accessibility features in software such as Adobe Digital Editions. Training on assistive technology for different disabilities was also suggested (e.g. tools for dyslexic users and tools for partially sighted users).
Finally, additional training was suggested on using screen-reading and text-to-speech tools with e-books, including free screen-reading tools and how different file types can be accessed with text-to-speech tools. This topic stood out as a key theme, mentioned in half of the comments received (8 of 16). This supports the earlier findings of the project team that there is a need for additional support and development amongst librarians in UK HE around using screen-reading and text-to-speech tools.
It is important to note that creating an accessible end-user e-book experience is complex and affected by multiple factors. Thus, oversimplification in the use of the survey tool and interpretation of the audit results should be avoided, as the measures used in the audit capture only part of the accessibility picture. With this in mind, some suggestions for different groups on using the audit results follow.
Suppliers are encouraged to look at their individual platform feedback report27 and use this to guide improvements to accessibility. Although the audit was designed to provide a snapshot of accessibility at the time it was conducted, the audit questionnaire28 continues to be available online and when suppliers make changes to their platforms they are encouraged to re-audit their updated platform. The E-book Accessibility Audit project team will then update the results spreadsheet and the platform feedback reports with the new data.
It is suggested that those involved in e-book procurement use the individual platform feedback report29 for platforms they are subscribing to and consider the results of the audit when negotiating with e-book providers, using the audit as a framework to start conversations about accessibility. This should be done for both new and existing subscriptions. Raising awareness with suppliers is critical to effecting change.
It would be beneficial for institutions to work towards a joined-up approach where academics are alerted to the more accessible platforms and publishers. Where there is flexibility on reading lists it is better for students (and budgets) to point to more accessible providers. The individual platform feedback reports,30 in particular the ‘score breakdown’ overview tables on page six of each report, can raise awareness of the elements that constitute accessibility and the variability in accessibility amongst different platforms. It is challenging to have platform accessibility influence resource selection, but there is certainly an argument for raising awareness and questioning suppliers.
The flow diagram shown in Figure 1 above indicates the points in the e-book supply chain where accessibility can be lost, so raising awareness amongst learning technologists and IT infrastructure teams is critical to ensuring that local installations of key software (Adobe Digital Editions/Adobe Reader and text-to-speech tools) are up to date and installed across all networks. Devenney et al. recommend these are installed at all Northern Collaboration universities to improve the student experience.31
It is recommended that library and disability staff refer to the individual platform feedback reports when supporting students in order to advise users on the features they can expect to find within each e-book platform.32 It would also be useful to show students the interactive dashboards on the scoring spreadsheet33 so they can select the accessibility features that are high priority for them personally and find out the best platforms to use to meet those requirements.
The e-book accessibility audit has been a valuable and informative exercise. Significant variability in accessibility has been revealed between different e-book platforms, which could be due to a variety of factors. Individual feedback reports have been produced for widely used e-book platforms, making their accessibility in different areas more transparent and providing a quick and easy overview for staff to refer to when supporting students. The audit questionnaire represents a free tool which can be used to assess key elements of e-book accessibility. There are plans to run an updated survey in 2018.
Awareness of e-book accessibility has been raised across a large section of the UK HE sector. As a result of taking part in the audit, participants reported feeling more empathy for disabled learners and better equipped to advise them. A need has been identified for further training on various topics, in particular using e-books with screen-reading and text-to-speech software.
The majority of participants felt that e-book suppliers should provide better accessibility information and that accessibility should influence e-book procurement and reading list recommendations.
It is hoped that the e-book audit will act as a framework to support conversations between practitioners and providers which will result in improvements in accessibility that benefit all students across the sector. Making things accessible for disabled students improves usability for everybody.
A list of the abbreviations and acronyms used in this and other Insights articles can be accessed here – click on the URL below and then select the ‘Abbreviations and Acronyms’ link at the top of the page it directs you to: http://www.uksg.org/publications#aa
The authors have declared no competing interests.
Featherstone, L (2015). How can you make resources accessible for those with disabilities? In: Jisc blog, 13 July 2015. https://www.jisc.ac.uk/blog/how-can-you-make-resources-accessible-for-those-with-disabilities-13-jul-2015 (accessed 26 May 2017).
McNaught, A (2017). The e-book accessibility audit – use and abuse. In: Jisc blog, 7 February 2017. https://accessibility.jiscinvolve.org/wp/2017/02/07/ebookaudit-useabuse/ (accessed 30 May 2017).
Flow diagram showing potential ‘accessibility attrition’ points. https://accessibility.jiscinvolve.org/wp/files/2017/02/ebookflow.jpg (accessed 21 February 2017).
Mears, W and Clough, H (2015). Online library accessibility support: a case study within the Open University Library. Open Learning 30(1): 73–85, DOI: https://doi.org/10.1080/02680513.2015.1025735 (accessed 9 May 2017).
Jisc TechDis (2010). Towards Accessible e-book platforms. Jisc. https://www.learningapps.co.uk/moodle/xertetoolkits/play.php?template_id=1146#resume=3 (accessed 30 May 2017).
Jisc TechDis and CLAUD (2013). Libraries and Alternative Formats research. Jisc TechDis. https://www.learningapps.co.uk/moodle/xertetoolkits/play.php?template_id=1146#resume=3 (accessed 19 May 2017).
George, S, Clement, E and Hudson, G (2014). Auditing the accessibility of electronic resources. SCONUL Focus (62): 15–23. https://www.sconul.ac.uk/sites/default/files/documents/6_14.pdf (accessed 5 May 2017).
Mune, C and Agee, A (2016). Are e-books for everyone? An evaluation of academic e-book platforms’ accessibility features. Journal of Electronic Resources Librarianship 28(3): 172–182, DOI: https://doi.org/10.1080/1941126X.2016.1200927 (accessed 19 May 2017).
Devenney, A, Sarjantson, M, Stone, G and Thompson, S (2015). The experience of student use of eBooks on mobile devices. Middlesbrough: Northern Collaboration, Working Paper. DOI: https://doi.org/10.5920/hud.2016.27871 (accessed 4 May 2017).
Web Content Accessibility Guidelines (WCAG) Overview. https://www.w3.org/WAI/intro/wcag.php (accessed 30 May 2017).
Welcome to Web2Access. https://web2access.org.uk/ (accessed 30 May 2017).
eBook Accessibility Audit. https://docs.google.com/forms/d/e/1FAIpQLSd3CNbXwhrdFDFYNV_Z2QKoqsghlmcb5So-Yq0fviLHoLJ7pA/viewform (accessed 30 May 2017).
Hints and tips. https://docs.google.com/document/d/1LSo6eubwWTO_ydg3EKIESpG6f-gZrDthsKCH2N41PaE/edit (accessed 15 February 2017).
Colour contrast analyser. https://drive.google.com/file/d/0B7MUgdj5hjfEVGdOUDZRdDViOGM/view (accessed 26 May 2017).
Skip to content Demo. https://drive.google.com/file/d/0B7MUgdj5hjfEc3RwUndSTUF3RzA/view (accessed 27 February 2017).
Alt text Demo. https://drive.google.com/file/d/0B7MUgdj5hjfES1J3TTFSUGs3cTA/view (accessed 27 February 2017).
E-book Audit 2016. https://sites.google.com/site/ebookaudit2016/home (accessed 20 February 2017).
Interactive scoring – beta liteupload. https://sites.google.com/site/ebookaudit2016/data-and-related-fileds (accessed 27 February 2017).
Individual Platform Feedback Reports. https://sites.google.com/site/ebookaudit2016/reports (accessed 30 May 2017).
E-book audit – how can we help?. https://docs.google.com/a/leedsbeckett.ac.uk/forms/d/10ZF5o0IuknfNoIgDqgphEltnQLgfwUyBwIOFNZMR4cY/viewform?edit_requested=true (accessed 26 May 2017).
E-book audit – how can we help? Responses. https://docs.google.com/forms/d/10ZF5o0IuknfNoIgDqgphEltnQLgfwUyBwIOFNZMR4cY/viewanalytics (accessed 15 February 2017).