Timeline: Ongoing
Role: Usability and Inclusive Design Consultant

Product: Wearable Haptic Navigation Device and paired iOS Application

I collaborated with the WearWorks founders to ensure that their product delighted its Blind and Visually Impaired (BVI) user base with a tailored, thoughtful VoiceOver experience.

WearWorks believes that everyone deserves to live an independent life. Through their accessories and haptic language, they empower blind and visually impaired users to safely and intuitively navigate to any destination.

 
[Image: Original Wayband beta release]

The Product

Their first product is a wearable haptic navigation device for the blind and visually impaired. It guides users to a destination using only vibration.

In designing this innovative product, they drew on feedback from the BVI community, developers, and non-BVI users at each stage of the industrial design and app design processes.

Blind marathon runner Simon Wheatcroft and WearWorks founder Kevin Yoo run the 2017 NYC Marathon with the Wayband.

Press:

  • https://www.nbcsports.com/video/blind-runner-simon-wheatcroft-runs-solo-nyc-marathon

  • https://www.theverge.com/2017/11/6/16610728/2017-new-york-marathon-blind-runner-wearworks-wayband-simon-wheatcroft

 

WearWorks: 2019

User Experience for the Wayband App

Pairing WearWorks' wearable navigation device with the Wayband app creates additional possibilities for independent navigation. After WearWorks worked with a third-party company to create the UI elements, I stepped in to craft the VoiceOver experience for BVI users.

[Image: Wayband iOS app screens]
 
 

Ensuring Accessibility for the Blind User

VoiceOver for the Blind, Independent Navigator

After WearWorks worked with a third-party company to create the UI elements, screen flows, and copy, I stepped in to build a more detailed VoiceOver experience for blind users. From research, I knew that visually impaired users rarely rely on VoiceOver exclusively, so I narrowed and intensified my focus on blind users and their unique requirements for a positive experience.

[Images: smartphone mockup and VoiceOver gesture library]

(From Apple) To audit your app with VoiceOver on, you'll use VoiceOver's unique set of gestures to navigate your app. For testing, there are five key gestures (an automated complement to this manual audit is sketched after the list):

  • Swipe left or right to navigate to the next or previous UI element.

  • One-finger double-tap to activate the selected element.

  • Two-finger tap to stop and resume speaking.

  • Swipe up with two fingers to read everything onscreen.

  • Three-finger triple-tap to turn screen curtain on and off.
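
Manual gesture auditing pairs well with an automated first pass. As a minimal sketch, assuming a UI test target and Xcode 15 or later (the test class name is a hypothetical stand-in, not the Wayband project's code), XCTest's built-in audit can flag missing labels and similar issues before a human sits down with VoiceOver:

```swift
import XCTest

// Hypothetical UI test for the Wayband app; names are illustrative only.
final class WaybandAccessibilityAuditTests: XCTestCase {

    func testMainScreenPassesAccessibilityAudit() throws {
        let app = XCUIApplication()
        app.launch()

        // Runs Xcode's built-in accessibility audit on the current screen,
        // failing the test on issues such as elements with no VoiceOver label.
        try app.performAccessibilityAudit()
    }
}
```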

 

But the gestures an experienced blind user relies on go far beyond these first five (a sketch after the lists shows how an app can hook into two of them):

Navigate and read

  • Tap: Selects and speaks the item.

  • Swipe right or left: Selects the next or previous item.

  • Swipe up or down: Depends on the rotor setting. See Use the VoiceOver rotor.

  • Two-finger swipe up: Reads all from the top of the screen.

  • Two-finger swipe down: Reads all from the current position.

  • Two-finger tap: Stops or resumes speaking.

  • Two-finger scrub (move two fingers back and forth three times quickly, making a “z”): Dismisses an alert or returns to the previous screen.

  • Three-finger swipe up or down: Scrolls one page at a time.

  • Three-finger swipe right or left: Goes to the next or previous page (on the Home screen, for example).

  • Three-finger tap: Speaks additional information, such as position within a list or whether text is selected.

  • Four-finger tap at top of screen: Selects the first item on the page.

  • Four-finger tap at bottom of screen: Selects the last item on the page.

Activate

  • Double-tap: Activates the selected item.

  • Triple-tap: Double-taps an item.

  • Split-tap: An alternative to selecting an item and double-tapping to activate it, touch an item with one finger, then tap the screen with another.

  • Double-tap and hold (1 second) + standard gesture: Use a standard gesture. The double-tap and hold gesture tells iPhone to interpret the next gesture as standard. For example, you can double-tap and hold your finger on the screen until you hear three rising tones, and then without lifting your finger, drag your finger on a slider.

  • Two-finger double-tap: Initiates an action or halts or pauses an action in progress. For example, you can:

    • Answer or end a call.

    • Play or pause in Music, Videos, Voice Memos, or Photos (slideshows).

    • Take a photo in Camera.

    • Start or pause recording in Camera or Voice Memos.

    • Start or stop the stopwatch.

  • Two-finger double-tap and hold: Changes an item’s label to make it easier to find.

  • Two-finger triple-tap: Opens the Item Chooser.

  • Three-finger double-tap: Mutes or unmutes VoiceOver. If both VoiceOver and Zoom are enabled, use the three-finger triple-tap gesture.

  • Three-finger triple-tap: Turns the screen curtain on or off. If both VoiceOver and Zoom are enabled, use the three-finger quadruple-tap gesture.
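
Two of the gestures above only work if the app opts in. As a hedged sketch (the view controller and its start/stop behavior are illustrative, not WearWorks' shipped code), a screen can respond to the two-finger double-tap ("Magic Tap") and the two-finger scrub by overriding the corresponding UIAccessibility actions:

```swift
import UIKit

// Illustrative navigation screen; the behavior here is assumed for the example.
final class RouteViewController: UIViewController {

    private var isNavigating = false

    // Two-finger double-tap ("Magic Tap"): start or stop the screen's primary action.
    override func accessibilityPerformMagicTap() -> Bool {
        isNavigating.toggle()
        UIAccessibility.post(notification: .announcement,
                             argument: isNavigating ? "Navigation started" : "Navigation paused")
        return true // true tells VoiceOver the gesture was handled
    }

    // Two-finger scrub (the "z" gesture): dismiss the screen, mirroring a back or cancel action.
    override func accessibilityPerformEscape() -> Bool {
        dismiss(animated: true)
        return true
    }
}
```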

Creating VO Annotations

Annotating Wireframes for Accessibility Elements

A great resource that helped guide me was this Medium article. I wanted to communicate VoiceOver copy and flow to the development team in a way that made for an intuitive, logical experience, one where the blind user never gets lost in a screen layout built for sighted users.

Here is an example of detailed annotations communicating VO accessibility traits, specific VO copy and additional context notes, and grouped elements; a code sketch of how such annotations can map to UIKit follows the image.

[Image: Portfolio example annotation]
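
To show how annotations like these might land in code, here is a minimal UIKit sketch. The screen, labels, and VO copy are hypothetical stand-ins for the annotated wireframe, not the shipped Wayband implementation:

```swift
import UIKit

// Hypothetical "route summary" card standing in for an annotated wireframe screen.
final class RouteSummaryCard: UIView {

    private let destinationLabel = UILabel()          // visually shows e.g. "Pier 11"
    private let distanceLabel = UILabel()             // visually shows e.g. "0.8 mi"
    private let startButton = UIButton(type: .system) // visually shows "Start"

    /// Applies the kinds of notes captured in a VO annotation:
    /// grouped elements, specific VO copy, context hints, and traits.
    func applyAccessibilityAnnotations() {
        // Grouped elements: destination and distance become one spoken element,
        // so a single swipe reads the whole summary instead of two fragments.
        let routeSummary = UIAccessibilityElement(accessibilityContainer: self)
        routeSummary.accessibilityLabel = "Route to Pier 11, 0.8 miles" // specific VO copy
        routeSummary.accessibilityTraits = .staticText
        routeSummary.accessibilityFrameInContainerSpace =
            destinationLabel.frame.union(distanceLabel.frame)

        // The start button stays its own actionable element, annotated with
        // copy, a context hint, and a trait.
        startButton.accessibilityLabel = "Start navigation"
        startButton.accessibilityHint = "Begins haptic guidance on the wristband"
        startButton.accessibilityTraits = .button

        // Explicit VoiceOver reading order for this card.
        accessibilityElements = [routeSummary, startButton]
    }
}
```

Annotations on the wireframe spell out exactly these decisions, the label text, the hint, the trait, and which views are grouped, so the development team does not have to guess at them.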
 
 

Planning Usability Testing for All

(Ongoing project TBU)