Authoring Accessible Content
These guidelines are intended to help members of the University community create applications, websites, and digital content that people with disabilities can perceive, operate, understand, and fully enjoy.
Princeton’s digital accessibility policy and practices are guided by the success criteria of the Web Content Accessibility Guidelines 2.0 Level AA (WCAG 2.0 AA), as developed and maintained by the World Wide Web Consortium (W3C). To provide accessible digital content in accordance with the WCAG 2.0 AA criteria, the University recommends that testing and remediation be conducted on newly developed or deployed applications, websites, and digital content as described below.
Website owners are expected to make their content accessible to people who may have visual, auditory or reading disabilities, or who rely on assistive devices to perceive and understand the content.
Authors need to provide text alternatives for images and use structural elements (headings, lists, table headers) in written content. These considerations, along with additional considerations about color use and screen reader compatibility, are explored in detail in the 11 Key Accessibility Factors:
- Alternative text describes each image's meaning in context.
- Headings are formatted as H1/H2/H3 elements, not just big bold text.
- Lists are formatted as lists, not just symbols and numbers.
- Tables have real header cells, not just background colors.
- Color contrast is strong enough for users with low vision or colorblindness.
- Meaningful links are self-explanatory even out of context (unlike "click here").
- Identify languages for screen readers: without a language tag, "Español" may be read aloud as "A Spaniel."
- Avoid using images of text.
- Avoid using layout tables as fake columns.
- Avoid using sensory characteristics that disappear with layout or color perception changes ("the red items in the right-hand column").
- Avoid using color alone to provide meaning.
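The color contrast factor above is the one criterion with a precise numeric definition: WCAG 2.0 AA requires a contrast ratio of at least 4.5:1 for normal text (3:1 for large text), computed from the relative luminance of the foreground and background colors. As a minimal sketch (the function names here are our own, not part of any tool), the spec's formula can be computed directly:

```python
def relative_luminance(rgb):
    """WCAG 2.0 relative luminance of an sRGB color, given as 0-255 ints."""
    def channel(c):
        c = c / 255
        # Linearize the sRGB channel per the WCAG 2.0 definition.
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """Contrast ratio between two colors, per WCAG 2.0 (ranges 1.0 to 21.0)."""
    lighter, darker = sorted(
        (relative_luminance(fg), relative_luminance(bg)), reverse=True
    )
    return (lighter + 0.05) / (darker + 0.05)

# Black text on a white background: the maximum possible contrast, 21:1.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # → 21.0
# Mid-gray (#969696) on white falls below the 4.5:1 threshold for normal text.
print(contrast_ratio((150, 150, 150), (255, 255, 255)) >= 4.5)  # → False
```

In practice, automated tools compute this ratio for every text element on a page, so authors rarely need to calculate it by hand; the formula is shown here only to make the "strong enough" criterion concrete.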
As covered in detail in the Video Accessibility Guidelines, video or audio content that will be broadly used needs accurate captions or transcripts.
Determining whether websites and applications meet the WCAG 2.0 AA criteria and are compatible with common assistive devices requires hands-on testing. Quality assurance testing should be conducted throughout the development and content creation lifecycle: reviewing wireframes and prototypes for potential pitfalls, auditing content and code prior to release, and spot-checking new content and features in production.
Content owners can follow the Test for Accessibility guide to perform automated, keyboard and basic screen reader tests, but department and program websites and applications should be reviewed by an accessibility professional before launch.
Automated tests can discover simple issues such as poor color contrast in text, missing alternative text on images, and tables without headings. They cannot tell whether the font is legible, the alternative text is accurate, or the table headings make sense. Automated tests are an excellent time saver for proofreading and should be run regularly, but they do not replace manual testing.
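As an illustration of the kind of simple check automated tools perform, the following sketch uses only Python's standard-library HTML parser to flag images with no alt attribute at all. (An empty alt="" is deliberately not flagged: it is the correct markup for a purely decorative image that screen readers should skip. Real testing tools check far more rules than this.)

```python
from html.parser import HTMLParser

class MissingAltChecker(HTMLParser):
    """Collects <img> tags that lack an alt attribute entirely."""

    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        # Flag only a completely absent alt attribute; alt="" is valid
        # for decorative images.
        if tag == "img" and "alt" not in attrs:
            self.missing.append(attrs.get("src", "(no src)"))

checker = MissingAltChecker()
checker.feed('<img src="chart.png"><img src="logo.png" alt="University logo">')
print(checker.missing)  # → ['chart.png']
```

Note what this check cannot do: it confirms only that alt text exists, not that "University logo" accurately describes the image. That judgment still requires a human reviewer, which is exactly the limitation of automated testing described above.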
Keyboard & Mobile Testing
Content should be accessible without reliance on a mouse. You can conduct keyboard access tests yourself by using the Tab, Escape, Return/Enter, and arrow keys, as well as the space bar, on any keyboard. You should be able to reach and operate all interactive elements, and no element should trap keyboard focus without an obvious way to exit or close it.
Keyboard testing should also translate to accessible mobile experiences. Content should reflow without horizontal scrollbars, and users should be able to tap with one finger to operate all controls, even if other methods (swipe, shake, pinch, etc.) are available.
Basic Screen Reader Testing
Screen reading software audibly describes text content and text alternatives for visual content, and provides keyboard, gesture, or dictation-based interaction methods for users who may not be able to see or use a mouse.
Testing with actual desktop and mobile screen readers is important because they have specific and unique requirements: each element must be labeled with its role (e.g., button, link, heading) and state (e.g., collapsed or expanded); changes and alerts on the page must be marked as to whether they should be verbally announced; and interactive elements must be coded in ways that the assistive device can activate and navigate.
Sighted and non-sighted users generally do not use screen readers in the same way, and not all screen readers work the same. Self-testing with a screen reader should precede, not replace, professional testing.
Websites and applications should be reviewed before launch, and again when developing new or complicated features.
For internally developed products, request a consultation with the University's Digital Accessibility Developer to schedule testing. For products purchased from a vendor, request documentation that the product has been tested for usability per the IT Procurement Accessibility Guidelines.