
semantically meaningful notebooks¤

this is the first presentation of the nbconvert-a11y template for assistive reading experiences in notebooks. these templates look beyond web content accessibility guidelines to construct idealized experiences for assistive technology.

  1. perceivable notebooks
  2. operable notebooks
  3. notebook demo
  4. robust notebooks

TLDR¤

  • EVERYONE, including assistive technology users, should have accessible developer experiences
  • HTML is our matter for accessible experiences
  • WCAG is anachronistic for interactive computing education
  • reference implementations (e.g. the ARIA Authoring Practices Guide) provide valuable calibrations for more complex applications

beyond wcag¤

  • the section 508 amendment of the rehabilitation act of 1973 is drastically insufficient for the needs of interactive computing technology.
  • WCAG 2.0 was published in Dec 2008, long before the demanding needs of modern javascript (e.g. Modern Health, frameworks, performance, and harm).
  • WCAG 2.2, adopted in October 2023, is a more equitable goal with clearer specifications for implementers and testers than the prior versions. aim to satisfy AAA, and downgrade only when access needs are less important.
  • federal compliance goals actively neglect the needs of assistive technology users in modern learning environments and create a two-tier system.

what if web content accessibility guidelines cared about visualization and data?¤

Chartability is a set of heuristics for ensuring that data visualizations, systems, and interfaces are accessible.
the first four principles extend from web content accessibility guidelines. the last three are additional principles for accessible data experiences.
  • perceivable User must be able to easily identify content using their senses: sight, sound, and touch.
  • operable All controls must be error-tolerant, discoverable, and multi-modal (not just mouse operable, but using keyboard, etc).
  • understandable Any information or data are presented without ambiguity, with clarity, and in a way that minimizes cognitive load.
  • robust The design is compliant with existing standards and works with the user’s compliant, assistive technologies of choice.
  • compromising (Understandable, yet Robust): Information flows must provide transparency, tolerance, and consideration for different ways that users with assistive technologies and disabilities will prefer to consume different information.
  • assistive (Understandable and Perceivable but labor-reducing): Interface must be intelligent and multi-sensory in a way that reduces the cognitive and functional labor required for use.
  • flexible (Perceivable and Operable, yet Robust): Design must respect user settings from user agents (browsers, operating systems, applications) and provide presentation and operation control.

read more about the relationship between accessible computational notebooks and other web accessibility initiatives on the nbconvert-a11y work in progress wiki.

%%html
<!-- styling for the presentation -->
<style>
.barb li {
    list-style: none;
    a {font-size: 400%;}
    q {display: block; font-size: 150%; text-transform: lowercase;}
    background-color: var(--bg-color, var(--jp-layout-color0, white));
    font-weight: 1000;
    &:nth-child(2n+1) {filter: invert(100%);}
    &:nth-child(4) ~ li {text-transform: uppercase;}
    a:visited, a:link {color: unset; }   
}
</style>

designing assistive interfaces for POUR-CAF outcomes¤

accessibility is the floor. our perceivable designs consider all the sensory modalities that are possible with hypermedia. we'll find that correctly choosing semantics from the html standard yields superior accessibility outcomes.

  • how is the story perceivable?
    • what does the story feel like?
    • what does the story sound like?
    • what does the story look like?
  • how operable and understandable is your opinion?
  • how robust is your approach?
    • automated testing
      • did you validate the html data structure?
      • did you check the accessibility tree?
    • user testing
      • did you test with at least one screen reader?
      • how have you consulted affected communities?
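the "automated testing" questions above can be made concrete even without a full toolchain. a minimal sketch, using only the stdlib parser, of the kind of structural check a template can run over its exported html; the two rules (missing alt text, skipped heading levels) are illustrative examples, not the template's actual test suite, and real audits would also use an html validator and an engine like axe:

```python
# a toy structural audit of exported notebook html using the stdlib parser.
# it flags two illustrative failures: images without alt text, and heading
# levels that skip (e.g. h1 followed directly by h3), which breaks outline
# navigation for screen reader users.
from html.parser import HTMLParser

class A11yAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.problems = []
        self.last_heading = 0

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img" and not attrs.get("alt"):
            self.problems.append("img missing alt text")
        if tag in {"h1", "h2", "h3", "h4", "h5", "h6"}:
            level = int(tag[1])
            if self.last_heading and level > self.last_heading + 1:
                self.problems.append(
                    f"heading level skipped: h{self.last_heading} -> {tag}"
                )
            self.last_heading = level

audit = A11yAudit()
audit.feed("<h1>notebook</h1><h3>cell</h3><img src='plot.png'>")
print(audit.problems)
```

running checks like these against a reference template first gives a known-good baseline to compare applications against.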

demo: what does the story feel like?¤

> there is no such thing as a static document in HTML.

a demonstration of standard, low vision, and screen navigation of the nbconvert-a11y template.

  • tab stops

a snapshot of the tab stops in a template starting from the url

> non-standard tabbing events are plaguing jupyter keyboard accessibility. see tab traps pr

in real interactive applications, technical debt makes it hard to determine what the accessible notebook experience should be. because these applications were not designed to be accessible, they are poor references for our goal. reference templates give us the ability to test expectations and implement our applications accordingly.
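one expectation a reference template makes testable is the tab order itself. a rough sketch of enumerating tab stops from an html fragment; the tag list is simplified (it ignores `disabled`, unlinked anchors, etc.) and the sample markup is invented for illustration, not taken from the template:

```python
# enumerate tab stops in an html fragment: natively focusable elements plus
# anything with an explicit tabindex, skipping tabindex="-1" (which removes
# an element from the tab order). each stop is reported by its accessible
# name when an aria-label is present, else by its tag.
from html.parser import HTMLParser

FOCUSABLE = {"a", "button", "input", "select", "textarea"}

class TabStops(HTMLParser):
    def __init__(self):
        super().__init__()
        self.stops = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        tabindex = attrs.get("tabindex")
        if tabindex == "-1":
            return  # removed from the tab order
        if tag in FOCUSABLE or tabindex is not None:
            self.stops.append(attrs.get("aria-label") or tag)

walker = TabStops()
walker.feed(
    '<a href="#main" aria-label="skip to main">skip</a>'
    '<div tabindex="0" aria-label="cell 1"></div>'
    '<button tabindex="-1">hidden</button>'
)
print(walker.stops)
```

a snapshot like this, taken from the template, becomes the expected tab-stop sequence that a live application can be diffed against.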

what does the story sound like?¤

we'll listen to different aspects of screen reader navigation.

  • the gestalt of a notebook
    • at a glance, sighted visitors can assess overall qualities of notebooks. what are some?
  • cell navigation, we can't know what is best for generic content
  • aria live to indicate changes
  • the reference template provided a valuable test bed for aria live best practices to influence jupyterlab

what does the story look like?¤

  • reader mode color and text settings

    • nbconvert a11y settings juxtaposed next to firefox reader mode
  • notebook summary, table of contents, and activity log

    • the notebook summary and table of contents buttons
    • the user facing activity log

why is the urge to believe sight so strong?¤

do we try to cram too much information into our narrow visual channels? what happens if we express more of the electromagnetic spectrum?

http://www.eklhad.net/philosophy.html

did you test?¤

No Table (Critical)¤


html is our matter.
  • we achieve superior assistive technology experiences for computational works by expressing more of the html5 vocabulary to capture notebook semantics.

    > If you can use a native HTML element [HTML51] or attribute with the semantics and behavior you require already built in, instead of re-purposing an element and adding an ARIA role, state or property to make it accessible, then do so.
    >
    > https://www.w3.org/TR/using-aria/#firstrule

    > Do not change native semantics, unless you really have to.
    >
    > https://www.w3.org/TR/using-aria/#secondrule

  • it provides the accessible substrate for reading notebooks through progressive enhancement

  • the table structure is flexible for notebook documents because they are stored as structured data

the document object model for the notebook as a table

  • history of html tables
  • web typography tables

we should reduce the distance between source and html: accessibility can only be evaluated in html!
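because notebooks are stored as structured json, the mapping from source to semantic table markup can be direct. a toy sketch of that idea; the row markup and the simplified notebook json (real nbformat stores `source` as a list of lines) are invented for illustration, not the template's actual output:

```python
# notebooks are structured data: each cell maps naturally onto a table row,
# so an exporter can emit semantic table markup (caption, row headers)
# instead of generic divs.
import json
from html import escape

notebook = json.loads("""{
  "cells": [
    {"cell_type": "markdown", "source": "# title"},
    {"cell_type": "code", "source": "print(1 + 1)"}
  ]
}""")

rows = []
for index, cell in enumerate(notebook["cells"], start=1):
    rows.append(
        f'<tr><th scope="row">{index}</th>'
        f"<td>{cell['cell_type']}</td>"
        f"<td><pre>{escape(cell['source'])}</pre></td></tr>"
    )
table = "<table><caption>notebook cells</caption>" + "".join(rows) + "</table>"
print(table)
```

the point of the sketch is the shortness of the path: the table structure falls out of the stored data, which keeps the exported html auditable against the source.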

open questions¤

https://github.com/deathbeds/nbconvert-a11y/wiki/accessible-computational-notebooks#accessibility-standards-and-recommendations

%%html
<h3><a href="https://science.nasa.gov/mission/hubble/multimedia/sonifications/">data sonification - westerlund 2</a></h3>
<blockquote>
This is a cluster of young stars – about one to two million years old – located about 20,000 light-years from Earth. In its visual image form, data from Hubble (green and blue) reveals thick clouds where stars are forming, while X-rays seen by Chandra (purple) penetrate through that haze. In the sonified version of this data, sounds sweep from left to right across the field of view with brighter light producing louder sound. The pitch of the notes indicates the vertical position of the sources in the image with the higher pitches towards the top of the image. The Hubble data is played by strings, either plucked for individual stars or bowed for diffuse clouds. Chandra's X-ray data is represented by bells, and the more diffuse X-ray light is played by more sustained tones.
</blockquote>
<iframe allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" allowfullscreen="" frameborder="0" height="473" src="https://www.youtube.com/embed/ESz8Cvirh00" title="Data Sonification: Westerlund 2 (Multiwavelength)" width="840"></iframe>
