
@tonyfast's notebooks

notebook summary
  • title: semantically meaningful notebooks
  • description: this is the first presentation of the nbconvert-a11y template for assistive reading experiences in notebooks. these templates look beyond web content accessibility guidelines to construct idealized experiences for assistive technology.
  • cells: 17 total, 2 code
  • state: executed in order
  • kernel: Python [conda env:p311] *
  • language: python
  • kernel name: conda-env-p311-py
  • lines of code: 175
  • outputs: 2
1

semantically meaningful notebooks

this is the first presentation of the nbconvert-a11y template for assistive reading experiences in notebooks. these templates look beyond web content accessibility guidelines to construct idealized experiences for assistive technology.

  1. perceivable notebooks
  2. operable notebooks
  3. notebook demo
  4. robust notebooks
2

TLDR

  • EVERYONE, including assistive technology users, should have accessible developer experiences
  • HTML is our matter for accessible experiences
  • WCAG is anachronistic for interactive computing education
  • reference implementations (e.g. the ARIA Authoring Practices Guide) provide valuable calibration for more complex applications
3

beyond wcag

4

what if web content accessibility guidelines cared about visualization and data?

Chartability is a set of heuristics for ensuring that data visualizations, systems, and interfaces are accessible.
the first four principles extend from the web content accessibility guidelines; the last three are additional principles for accessible data experiences.
  • perceivable User must be able to easily identify content using their senses: sight, sound, and touch.
  • operable All controls must be error-tolerant, discoverable, and multi-modal (not just mouse operable, but using keyboard, etc).
  • understandable Any information or data are presented without ambiguity, with clarity, and in a way that minimizes cognitive load.
  • robust The design is compliant with existing standards and works with the user’s compliant, assistive technologies of choice.
  • compromising (Understandable, yet Robust): Information flows must provide transparency, tolerance, and consideration for different ways that users with assistive technologies and disabilities will prefer to consume different information.
  • assistive (Understandable and Perceivable but labor-reducing): Interface must be intelligent and multi-sensory in a way that reduces the cognitive and functional labor required for use.
  • flexible (Perceivable and Operable, yet Robust): Design must respect user settings from user agents (browsers, operating systems, applications) and provide presentation and operation control.

read more about the relationship between accessible computational notebooks and other web accessibility initiatives on the nbconvert-a11y work in progress wiki.

5

1 output.
6

designing assistive interfaces for POUR-CAF outcomes

accessibility is the floor. our perceivable designs consider all the sensory modalities possible with hypermedia. we'll find that correctly choosing semantics from the html standard yields superior accessibility outcomes.

  • how is the story perceivable?
    • what does the story feel like?
    • what does the story sound like?
    • what does the story look like?
  • how operable and understandable is your opinion?
  • how robust is your approach?
    • automated testing
      • did you validate the html data structure?
      • did you check the accessibility tree?
    • user testing
      • did you test with at least one screen reader?
      • how have you consulted affected communities?
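the automated checks above can be sketched with the standard library alone. this is an illustrative example, not the nbconvert-a11y test suite — real audits typically also run tools like axe-core against the accessibility tree.

```python
from html.parser import HTMLParser


class AltTextAudit(HTMLParser):
    """collect <img> tags that lack an alt attribute.

    note: alt="" is valid for decorative images, so only a *missing*
    attribute is flagged.
    """

    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attributes = dict(attrs)
            if "alt" not in attributes:
                self.missing.append(attributes.get("src", "<unknown>"))


def audit_alt_text(html):
    """return the src of every image missing an alt attribute."""
    parser = AltTextAudit()
    parser.feed(html)
    return parser.missing


# an image without alt text fails the audit
print(audit_alt_text('<img src="plot.png"><img src="logo.png" alt="project logo">'))
# → ['plot.png']
```

the same parser-subclass pattern extends to other structural checks, such as verifying heading order or landmark nesting.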
7

demo: what does the story feel like?

there is no such thing as a static document in HTML.

a demonstration of standard, low vision, and screen navigation of the nbconvert-a11y template.

  • tab stops

a snapshot of the tab stops in a template starting from the url

non-standard tabbing events are plaguing jupyter keyboard accessibility. see the tab traps PR

8

in real interactive applications, technical debt makes it hard to determine the accessible notebook experience. modern applications are not designed to be accessible, so they are bad references for our goal. reference templates give us the ability to test expectations and implement our applications accordingly.

9

what does the story sound like?

we'll listen to different aspects of screen reader navigation.

  • the gestalt of a notebook
    • at a glance, sighted visitors can assess overall qualities of notebooks. what are some?
  • cell navigation, we can't know what is best for generic content
  • aria live to indicate changes
    • the reference template provided a valuable test bed for aria live best practices to influence jupyterlab
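as an illustration of the aria live pattern above — the markup here is hypothetical, not the template's actual activity log: `role="log"` implies `aria-live="polite"`, so screen readers announce appended entries when the user is idle, without moving focus.

```python
def activity_log(entries):
    """build a minimal, hypothetical activity-log region.

    role="log" carries an implicit aria-live="polite"; the explicit
    attribute is included here for clarity.
    """
    items = "".join(f"<li>{entry}</li>" for entry in entries)
    return (
        '<section aria-label="activity log">'
        '<ul role="log" aria-live="polite">'
        + items +
        "</ul></section>"
    )


print(activity_log(["cell 3 executed", "output updated"]))
```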
10

what does the story look like?

  • reader mode color and text settings

    • nbconvert a11y settings juxtaposed with firefox reader mode
  • notebook summary, table of contents, and activity log

    • the notebook summary and table of contents buttons
    • the user facing activity log
11

why is the urge to believe sight so strong?

do we try to cram too much information into our narrow visual channels? what happens if we express more of the electromagnetic spectrum?

http://www.eklhad.net/philosophy.html

12

did you test?

13

No Table (Critical)

html is our matter.
  • we achieve superior assistive technology experiences for computational works by expressing more of the html5 vocabulary to capture notebook semantics.

    If you can use a native HTML element [HTML51] or attribute with the semantics and behavior you require already built in, instead of re-purposing an element and adding an ARIA role, state or property to make it accessible, then do so.

    https://www.w3.org/TR/using-aria/#firstrule

    Do not change native semantics, unless you really have to.

    https://www.w3.org/TR/using-aria/#secondrule

  • it provides the accessible substrate for reading notebooks through progressive enhancement

  • the table structure is flexible for notebook documents because they are stored as structured data

    the document object model for the notebook as a table

  • history of html tables
  • web typography: tables
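a sketch of the notebook-as-table idea, assuming nothing about the real template's markup: because a notebook is stored as structured JSON, its cells map directly onto a native HTML table, following the first rule of ARIA (prefer built-in semantics over bolted-on roles).

```python
import html
import json


def notebook_to_table(nb_json):
    """render notebook cells as a native HTML table (illustrative only)."""
    notebook = json.loads(nb_json)
    rows = []
    for index, cell in enumerate(notebook["cells"], start=1):
        # escape cell source so markup in the source cannot break the table
        source = html.escape("".join(cell["source"]))
        rows.append(
            f'<tr><th scope="row">{index}</th>'
            f"<td>{cell['cell_type']}</td><td><pre>{source}</pre></td></tr>"
        )
    return (
        "<table><caption>notebook cells</caption>"
        "<thead><tr><th>cell</th><th>type</th><th>source</th></tr></thead>"
        "<tbody>" + "".join(rows) + "</tbody></table>"
    )


nb = '{"cells": [{"cell_type": "markdown", "source": ["# title"]}]}'
print(notebook_to_table(nb))
```

native `<caption>`, `<th scope>`, and row headers give screen readers the table's structure for free — no ARIA required.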

14

reducing the distance between source and html. accessibility can only be evaluated in html!

15

open questions

  • accessible application
  • accessible input group forms
  • accessible output groups
  • accessible interactive computing
  • accessible debugging
  • accessible collaboration
16

https://github.com/deathbeds/nbconvert-a11y/wiki/accessible-computational-notebooks#accessibility-standards-and-recommendations

17
%%html
<h3><a href="https://science.nasa.gov/mission/hubble/multimedia/sonifications/">data sonification - westerlund 2</a></h3>

<blockquote>
This is a cluster of young stars – about one to two million years old – located about 20,000 light-years from Earth. In its visual image form, data from Hubble (green and blue) reveals thick clouds where stars are forming, while X-rays seen by Chandra (purple) penetrate through that haze. In the sonified version of this data, sounds sweep from left to right across the field of view with brighter light producing louder sound. The pitch of the notes indicates the vertical position of the sources in the image with the higher pitches towards the top of the image. The Hubble data is played by strings, either plucked for individual stars or bowed for diffuse clouds. Chandra's X-ray data is represented by bells, and the more diffuse X-ray light is played by more sustained tones.
</blockquote>

<iframe width="840" height="473" src="https://www.youtube.com/embed/ESz8Cvirh00" title="Data Sonification: Westerlund 2 (Multiwavelength)" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" allowfullscreen></iframe>
1 output.

data sonification - westerlund 2

This is a cluster of young stars – about one to two million years old – located about 20,000 light-years from Earth. In its visual image form, data from Hubble (green and blue) reveals thick clouds where stars are forming, while X-rays seen by Chandra (purple) penetrate through that haze. In the sonified version of this data, sounds sweep from left to right across the field of view with brighter light producing louder sound. The pitch of the notes indicates the vertical position of the sources in the image with the higher pitches towards the top of the image. The Hubble data is played by strings, either plucked for individual stars or bowed for diffuse clouds. Chandra’s X-ray data is represented by bells, and the more diffuse X-ray light is played by more sustained tones.