OpenSidewalks project collects pedestrian infrastructure data


Many routing applications can give a detailed route from start point to destination, except for the non-motorized, pedestrian legs of a journey. The OpenSidewalks project addresses the fact that most cities lack open, consistently formatted, readily available sidewalk data that supports routing for non-motorized travel. The data crowdsourced by the OpenSidewalks project powers applications like AccessMap.

Image: the AccessMap logo, with a view of the app on a mobile phone held in a person's left hand.

AccessMap provides customized accessible sidewalk and footpath routing directions based on your personal ability profile. This can benefit anyone, but is particularly designed to address the informational needs of people with mobility limitations.
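As a rough sketch of how profile-aware pedestrian routing can work (the graph, segment lengths, inclines, and the 5% threshold below are all invented for illustration, not AccessMap's actual data or cost model), a shortest-path search can simply skip sidewalk segments steeper than the user's ability profile allows:

```python
import heapq

# Hypothetical sidewalk graph: node -> list of (neighbor, length_m, incline)
SIDEWALKS = {
    "A": [("B", 100, 0.02), ("C", 60, 0.09)],
    "B": [("D", 80, 0.01)],
    "C": [("D", 50, 0.03)],
    "D": [],
}

def route(graph, start, goal, max_incline):
    """Dijkstra over sidewalk segments, skipping any steeper than the
    user's comfortable maximum incline (an ability-profile setting)."""
    frontier = [(0.0, start, [start])]
    seen = set()
    while frontier:
        cost, node, path = heapq.heappop(frontier)
        if node == goal:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nbr, length, incline in graph.get(node, []):
            if incline > max_incline:
                continue  # segment too steep for this profile
            heapq.heappush(frontier, (cost + length, nbr, path + [nbr]))
    return None  # no accessible route exists

# A profile capped at 5% incline avoids the steep A-C shortcut.
print(route(SIDEWALKS, "A", "D", max_incline=0.05))  # (180.0, ['A', 'B', 'D'])
```

The same search with a more permissive profile (say, 10%) would take the shorter but steeper path through C, which is how one graph can serve many different ability profiles.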

Image: a dark figure of a man who uses a wheelchair, wearing an XR headset, against mountains in a surreal lavender-and-orange sunset.


FleXR

The intent of this work is to allow anyone with their own custom input system to access Extended Reality (XR) applications without requiring the physical movement of head-mounted displays or hand controllers that is standard today. We have implemented FleXR as a prototype to demonstrate the feasibility of standardizing input systems for XR, in the form of a plugin for the WebXR framework A-Frame that is easy to integrate into new and existing applications. We are currently looking for individuals with motor disabilities interested in participating in a user study of the impact FleXR has when integrated into XR applications. No prior experience with XR devices is required to participate. We are also looking for application developers with existing A-Frame applications to try integrating FleXR into their work, as well as developers of new A-Frame applications, which should follow the FleXR specification.
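FleXR itself is a JavaScript plugin for A-Frame; purely as an illustration of the underlying idea (the class, action names, and device events below are invented, not the real FleXR API), standardizing XR input amounts to a remapping layer between arbitrary device events and a fixed vocabulary of actions the application understands:

```python
# Hypothetical standardized XR action vocabulary, for illustration only;
# the real FleXR specification defines its own action set.
XR_ACTIONS = {"look_left", "look_right", "select", "move_forward"}

class InputRemapper:
    """Maps events from any custom input device onto standard XR actions,
    so the application never needs to know about the physical device."""

    def __init__(self):
        self.bindings = {}

    def bind(self, device_event, action):
        if action not in XR_ACTIONS:
            raise ValueError(f"unknown XR action: {action}")
        self.bindings[device_event] = action

    def translate(self, device_event):
        # Returns None for events with no binding.
        return self.bindings.get(device_event)

# A sip-and-puff switch can drive the same app as a hand controller:
remap = InputRemapper()
remap.bind("sip", "select")
remap.bind("puff", "move_forward")
print(remap.translate("sip"))  # select
```

The point of the indirection is that swapping in a different input device only changes the bindings, never the application code.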

Universal Play Kiosk

Universal Collaborative Play

Play is for everyone! The Universal Playgroup creates opportunities for team play and access to open-source inclusive design for all. The Universal Playgroup started as a re-imagining of a work and play space intended for all abilities. An all-abilities space was crucial because the physical play space directly impacts how quickly and efficiently participants of all abilities can immerse themselves in team play. The Universal Playgroup team consists of experienced educators, students, and researchers. We design play scenarios that integrate technology in ambient ways, allowing the environment to infer context and participant intentions. The goal is to promote teamwork and elevate people, not assistive technology, in the interaction. Our play technology is designed to support everyone's participation rather than isolate users from the team experience.

Tactile MapTile

Tactile MapTile employs a unique tactile-based representation of map features to enhance spatial understanding for people with a broad range of visual abilities. Each feature is represented by a different tactile texture and pattern. Users can generate and customize a 3D map model based on their choice of location and which features to include.
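One way to picture the generation step (the feature types, pattern names, and relief heights below are invented examples, not the actual MapTile encoding): filter the map features the user selected and attach a distinct tactile style to each before extruding the 3D model:

```python
# Hypothetical texture vocabulary: each map feature type gets a distinct
# tactile pattern and relief height (mm) on the printed 3D tile.
TACTILE_STYLES = {
    "road":     {"pattern": "smooth",      "height_mm": 1.0},
    "sidewalk": {"pattern": "fine_dots",   "height_mm": 2.0},
    "building": {"pattern": "solid_block", "height_mm": 5.0},
    "water":    {"pattern": "wave_lines",  "height_mm": 0.5},
}

def build_tile(features, include):
    """Keep only the feature types the user selected and attach the
    tactile style each one should be rendered with."""
    return [
        {**feature, **TACTILE_STYLES[feature["type"]]}
        for feature in features
        if feature["type"] in include
    ]

features = [
    {"type": "road", "name": "Main St"},
    {"type": "building", "name": "Library"},
    {"type": "water", "name": "Lake"},
]
# A user who only wants roads and buildings omits the water layer.
tile = build_tile(features, include={"road", "building"})
```

Because the styles are distinguishable by touch alone, the same lookup table guarantees that no two selected feature types feel the same on the finished tile.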

Image: 3D model of a smart and connected power wheelchair surrounded by wireless-wave iconography: a GPS location icon on the right arm support, and icons of an on-board tablet's wireless emissions and outgoing messages on the left arm support.

Autonomous Wheelchair Kit

The Autonomous Wheelchair project at the Taskar Center for Accessible Technology focuses on adding environmental sensors and real-time, on-board machine intelligence to enhance powered mobility options already on the market. We are designing a smart and connected power wheelchair add-on kit that would support both autonomous personal mobility and shared control (also known as shared autonomy) for people with mobility limitations. The kit is manufacturer-independent and would connect to any powered mobility device via a standard joystick controller to deliver new and improved capabilities. The greater vision for the project is not only to address common safety tasks (collision avoidance, extreme humidity and temperature alerts) and navigation tasks (e.g., given a destination, identify a wheelchair-accessible route through a pedestrian environment, or follow a leader in a pedestrian environment), but also to study the dynamic social roles that mobility devices play in group interactions, including the perceptual and interaction challenges that would make an assistive mobility robot capable of engaging in group interaction.
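A minimal sketch of one common shared-control scheme, linear blending, under the assumption that both the user and the autonomy express commands as (forward speed, turn rate) pairs; the numbers and the blending weight below are illustrative, not the project's actual controller:

```python
def shared_control(user_cmd, auto_cmd, alpha):
    """Linear blending, a common shared-autonomy scheme: the output
    command is a weighted mix of the user's joystick input and the
    autonomous controller's suggestion. alpha=1 gives the user full
    control; alpha=0 is fully autonomous."""
    v = alpha * user_cmd[0] + (1 - alpha) * auto_cmd[0]  # forward speed
    w = alpha * user_cmd[1] + (1 - alpha) * auto_cmd[1]  # turn rate
    return (v, w)

# The user drives straight toward an obstacle; the collision-avoidance
# module suggests slowing and turning, and the blend nudges the chair away
# without taking control from the user entirely.
user = (1.0, 0.0)   # full speed ahead
auto = (0.5, -0.8)  # slow down and turn
print(shared_control(user, auto, alpha=0.6))
```

In practice the weight need not be fixed: a controller can raise the autonomy's share only when a collision is imminent, which is part of what makes shared autonomy feel cooperative rather than overriding.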

Community Design for Seattle Design Festival, 2017

iDesign Lab

iDesign Lab engages a community of practice in open-source technology design, build, and use. We engage communities of practice in collaborative design, aiming to increase access to everyday technologies through community ideation and development of ability-focused novel interfaces, the addition of sensors and input devices to existing technologies, and large civic engagement projects. Co-designing empowers everyone. The iDesign Lab is a setting where people of all abilities, caregivers, and therapists learn to modify, design, and create customized assistive devices, and it cultivates a 'maker' culture among participants. We form multidisciplinary teams to identify access barriers individuals face in their everyday lives, and respond to these inequities by co-designing innovative, low-cost access solutions. Our students take our designed-for-one solutions and generalize the designs, promoting inclusive design on a broader scale.

Gesture Learning

Apple products use the same gesture vocabulary for input across devices, and unifying gestures in this way certainly helps their adoption. However, in the context of a population with hugely variable abilities, achieving broad or general impact requires adapting or customizing input gestures for each individual. As a first step, we aim to make touchscreens truly accessible to populations with involuntary motion, such as people with cerebral palsy or elderly individuals with essential tremor. We are building a system that learns and recognizes both intended and unintended multi-touch gestures. By recognizing the latter, we can customize accessibility for individuals exhibiting involuntary movements.
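As an illustrative stand-in for the learned recognizer (the features, threshold, and touch traces below are invented, not the project's model), one cue that separates a deliberate swipe from tremor is how much a touch trace backtracks relative to its net movement:

```python
import math

def touch_features(trace):
    """Summarize a touch trace [(x, y), ...] into two simple features:
    total path length and net displacement. Tremor tends to produce
    long, jittery paths with little net movement."""
    path = sum(math.dist(a, b) for a, b in zip(trace, trace[1:]))
    net = math.dist(trace[0], trace[-1])
    return path, net

def classify(trace, jitter_ratio=3.0):
    """Heuristic classifier: a deliberate swipe moves roughly in one
    direction, so its path length stays close to its net displacement;
    an involuntary movement backtracks, inflating the ratio."""
    path, net = touch_features(trace)
    if net == 0:
        return "unintended"
    return "unintended" if path / net > jitter_ratio else "intended"

swipe  = [(0, 0), (10, 1), (20, 0), (30, 1)]       # mostly straight
tremor = [(0, 0), (4, 3), (0, 5), (5, 1), (1, 2)]  # back-and-forth
print(classify(swipe), classify(tremor))  # intended unintended
```

A learned model would replace the fixed threshold with per-user statistics, which is exactly where the per-individual customization described above comes in.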

The TACcess DIY Switch Accessibility Kit

The TACcess Kit is designed as an easy-to-use starter kit for a wide range of accessibility projects, focusing on the home as a sandbox for interactive switch-accessible play. Kit recipients can use the TACcess Kit to turn many battery-operated toys into switch-adapted toys. They can learn how to create customized interactive surfaces by turning almost any material into a sensor. The kit assumes no knowledge of circuits or electronics. Intended users are design students, caregivers of individuals with limited mobility, and motivated high school students.

Next Gen Augmentative Communication

Augmentative communication devices are traditionally speech-generating devices through which the user constructs speech via input devices. These devices are typically devoid of any context related to the user's position, geographical location, scene understanding, etc. With the recent addition of cameras, global positioning sensors, and fast computing power to mobile devices, we can make the augmentative communication experience more immersive while removing one level of friction between user and machine. This work will result in a new paradigm for people with speech disabilities to interact with augmentative speech devices. We are developing an innovative tool that enhances communication devices by immersing the device in the speaker's environment, resulting in context-informed augmentative communication.
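A toy sketch of the idea (the phrase bank and context tags are invented for illustration): once the device can infer where the speaker is, for example from GPS or the camera, candidate utterances can be ranked by how well they match that context, so likely phrases surface first:

```python
# Hypothetical phrase bank tagged with the contexts in which each
# utterance is likely to be needed.
PHRASES = [
    ("I'd like a coffee, please.", {"cafe"}),
    ("Which platform does my train leave from?", {"station"}),
    ("Can you point me to the accessible entrance?", {"cafe", "station", "library"}),
    ("I'm doing well, thanks.", set()),  # context-free small talk
]

def suggest(context, limit=2):
    """Rank candidate utterances: phrases tagged with the current
    context float to the top; untagged phrases stay available lower
    in the list as a fallback. The sort is stable, so ties keep
    their original order."""
    ranked = sorted(PHRASES, key=lambda p: context not in p[1])
    return [text for text, _ in ranked[:limit]]

print(suggest("cafe"))
```

Reducing how many selections it takes to reach the right utterance is the friction the paragraph above refers to: the fewer taps between intent and speech, the more immersive the conversation.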