Many routing application providers can give a detailed route from start point to destination, except for the non-motorized, pedestrian leg of a journey! The OpenSidewalks project addresses the fact that most cities lack open, consistently formatted, readily available sidewalk data that supports routing for non-motorized travel. With the data crowdsourced by the OpenSidewalks project, we can power applications like AccessMap (found at accessmap.io).
AccessMap (accessmap.io) provides customized, accessible sidewalk and footpath routing directions based on your personal ability profile. This can benefit anyone, but it is particularly designed to address the informational needs of people with mobility limitations.
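To illustrate what ability-profile routing over sidewalk data can look like, here is a minimal sketch: Dijkstra's algorithm over a toy sidewalk graph whose edges carry length, incline, and curb-ramp attributes, with a cost function shaped by a personal profile. The graph, profile fields, and weights are illustrative assumptions, not OpenSidewalks' schema or AccessMap's actual routing engine.

```python
import heapq

def edge_cost(edge, profile):
    # edge = (length_m, incline_grade, has_curb_ramp); all fields are assumed names
    length, incline, has_curb_ramp = edge
    if abs(incline) > profile["max_incline"]:
        return float("inf")  # too steep for this profile: treat as impassable
    if profile["needs_curb_ramps"] and not has_curb_ramp:
        return float("inf")
    return length * (1 + profile["incline_penalty"] * abs(incline))

def shortest_path(graph, start, goal, profile):
    """Dijkstra over a dict: node -> [(neighbor, edge), ...]."""
    dist, prev = {start: 0.0}, {}
    heap = [(0.0, start)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == goal:
            break
        if d > dist.get(node, float("inf")):
            continue
        for nbr, edge in graph.get(node, []):
            nd = d + edge_cost(edge, profile)
            if nd < dist.get(nbr, float("inf")):
                dist[nbr], prev[nbr] = nd, node
                heapq.heappush(heap, (nd, nbr))
    # Walk predecessors back from the goal to recover the route
    path, node = [], goal
    while node in prev:
        path.append(node)
        node = prev[node]
    return [start] + path[::-1] if path or start == goal else []

# Toy network: the B leg is shorter but steep; the C leg is longer but flat.
graph = {
    "A": [("B", (50, 0.10, True)), ("C", (80, 0.02, True))],
    "B": [("D", (50, 0.10, True))],
    "C": [("D", (80, 0.02, True))],
}
cautious = {"max_incline": 0.05, "needs_curb_ramps": True, "incline_penalty": 5}
print(shortest_path(graph, "A", "D", cautious))  # ['A', 'C', 'D'] — avoids the steep leg
```

A different profile (say, one tolerating a 20% grade) would route through B instead; the point is that the "best" route depends on the traveler, not just on distance.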
Universal Collaborative Play
Play is for everyone! The Universal Playgroup creates opportunities for team play and access to open-source inclusive design for all. The Universal Playgroup started as a re-imagining of a work and play space intended for all abilities. An all-abilities space was crucial because the physical play space directly impacts how quickly and fully participants of all abilities can immerse themselves in team play. The Universal Playgroup team consists of experienced educators, students, and researchers. We design play scenarios that integrate technology in ambient ways, allowing the environment to infer context and participant intentions. The goal is to promote teamwork and elevate people, not assistive technology, in the interaction. Our play technology is designed to support everyone's participation rather than isolate users from the team experience.
Tactile MapTile employs a unique tactile-based representation of map features to enhance spatial understanding for people across a broad range of visual abilities. Each feature is represented by a different tactile texture and pattern. Users can generate and customize a 3D map model based on their choice of location and which features to include.
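As a sketch of the customization step described above, the snippet below pairs each map feature type with a distinct tactile pattern and assembles a tile specification from a user's selections. The pattern names, relief heights, and spec layout are assumptions for illustration, not Tactile MapTile's actual data format.

```python
# Hypothetical feature -> tactile pattern table; values are illustrative only.
TACTILE_PATTERNS = {
    "road":     {"pattern": "smooth",       "relief_mm": 0.5},
    "building": {"pattern": "raised_block", "relief_mm": 3.0},
    "water":    {"pattern": "wave_ridges",  "relief_mm": 1.0},
    "park":     {"pattern": "dot_grid",     "relief_mm": 1.5},
}

def tile_spec(location, selected_features):
    """Build a spec for one tactile tile from the user's chosen features."""
    unknown = [f for f in selected_features if f not in TACTILE_PATTERNS]
    if unknown:
        raise ValueError(f"no tactile pattern defined for: {unknown}")
    return {
        "location": location,
        "layers": [dict(feature=f, **TACTILE_PATTERNS[f]) for f in selected_features],
    }

spec = tile_spec("Seattle: University District", ["road", "building", "water"])
for layer in spec["layers"]:
    print(layer["feature"], "->", layer["pattern"])
```

Keeping the feature-to-pattern mapping in one table is what makes every feature tactually distinguishable: no two selected features share a texture, and the same spec could drive a 3D-printable model generator.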
Autonomous Wheelchair Kit
The Autonomous Wheelchair project at the Taskar Center for Accessible Technology focuses on adding environmental sensors and real-time, on-board machine intelligence to enhance powered mobility options already on the market. We are designing a smart, connected power wheelchair add-on kit that supports both autonomous personal mobility and shared control (also known as shared autonomy) for people with mobility limitations. The kit is manufacturer-independent and connects to any powered mobility device via a standard joystick controller to deliver new and improved capabilities. The greater vision for the project is not only to address common safety tasks (collision avoidance, extreme humidity and temperature alerts) and navigation tasks (e.g., given a destination, identify a wheelchair-accessible route in a pedestrian environment, or follow a leader in a pedestrian environment), but also to study the dynamic social roles that mobility devices play in group interactions, including the perceptual and interaction challenges that would make an assistive mobility robot capable of engaging in group interaction.
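Shared control, in its simplest form, arbitrates between the user's joystick command and an autonomous command. The sketch below shows one common pattern: linearly blending the two, shifting authority toward an autonomous stop command as an obstacle gets closer. The distance thresholds and the (linear, angular) velocity command format are assumptions for illustration, not the kit's actual interface.

```python
def blend(user_cmd, auto_cmd, alpha):
    """Linear arbitration: alpha=1 gives the user full authority."""
    return tuple(alpha * u + (1 - alpha) * a for u, a in zip(user_cmd, auto_cmd))

def shared_control(user_cmd, obstacle_distance_m, stop_at_m=0.5, caution_at_m=2.0):
    # Autonomy proposes a full stop; its authority grows as the obstacle nears
    # (a minimal collision-avoidance arbitration, not the project's real policy).
    auto_cmd = (0.0, 0.0)  # (linear, angular) velocity
    if obstacle_distance_m >= caution_at_m:
        alpha = 1.0                     # clear path: user drives freely
    elif obstacle_distance_m <= stop_at_m:
        alpha = 0.0                     # too close: autonomy stops the chair
    else:
        alpha = (obstacle_distance_m - stop_at_m) / (caution_at_m - stop_at_m)
    return blend(user_cmd, auto_cmd, alpha)

print(shared_control((1.0, 0.2), obstacle_distance_m=5.0))  # (1.0, 0.2): user in control
print(shared_control((1.0, 0.2), obstacle_distance_m=0.3))  # (0.0, 0.0): full stop
```

Because the blend degrades authority gradually rather than switching abruptly, the rider still feels in control through the caution zone, which is one reason shared autonomy is often preferred over fully autonomous takeover.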
The iDesign Lab engages communities of practice in open-source technology design, build, and use. Through collaborative design, we aim to increase access to everyday technologies via community ideation and development of ability-focused novel interfaces, the addition of sensors and input devices to existing technologies, and large civic engagement projects. Co-designing empowers everyone. The iDesign Lab is a setting where people of all abilities, caregivers, and therapists learn to modify, design, and create customized assistive devices, and it cultivates a 'maker' culture among participants. We form multidisciplinary teams to identify access barriers individuals face in their everyday lives, and we respond to these inequities by co-designing innovative, low-cost access solutions. Our students take our designed-for-one solutions and generalize the designs, promoting inclusive design on a broader scale.
Apple products use the same gesture vocabulary for input across devices. Certainly, unifying gestures helps their adoption. However, for a population with hugely variable abilities, achieving broad or general impact requires adapting or customizing input gestures for each individual. As a first step, we aim to make touchscreens truly accessible to people with involuntary motion, such as people with cerebral palsy or older adults with essential tremor. We are building a system that learns and recognizes both intended and unintended multi-touch gestures. By recognizing the latter, we can customize accessibility for individuals with involuntary movements.
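To make the intended-versus-unintended distinction concrete, here is a deliberately simplified sketch. A real system would learn per-user models; here a single hand-tuned feature (path length divided by net displacement of a touch trace) stands in for that model, since tremor tends to produce back-and-forth motion with little net travel. The traces, threshold, and feature choice are all illustrative assumptions.

```python
import math

def jitter_ratio(trace):
    """trace: list of (x, y) touch samples. A ratio near 1 means a near-straight
    stroke; a much larger ratio means back-and-forth motion typical of tremor."""
    path = sum(math.dist(a, b) for a, b in zip(trace, trace[1:]))
    net = math.dist(trace[0], trace[-1])
    return path / net if net > 0 else float("inf")

def classify(trace, threshold=2.0):
    # Hand-tuned threshold; a learned system would fit this per user.
    return "intended" if jitter_ratio(trace) < threshold else "unintended"

swipe  = [(0, 0), (20, 1), (40, 0), (60, 1), (80, 0)]        # near-straight swipe
tremor = [(0, 0), (5, 4), (1, -3), (6, 5), (2, -4), (3, 0)]  # jittery motion
print(classify(swipe), classify(tremor))  # intended unintended
```

Recognizing the "unintended" class is the useful part: those touches can be filtered out or mapped to nothing, so they no longer trigger spurious taps and drags.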
The TACcess DIY Switch Accessibility Kit
The TACcess Kit is designed as an easy-to-use starter kit for a wide range of accessibility projects, focusing on the home as a sandbox for interactive, switch-accessible play. Kit recipients can use the TACcess Kit to turn many battery-operated toys into switch-adapted toys, and they can learn to create customized interactive surfaces by turning almost any material into a sensor. The kit assumes no knowledge of circuits or electronics. Intended users include design students, caregivers of individuals with limited mobility, and motivated high school students.
Next Gen Augmentative Communication
Augmentative communication devices are traditionally speech-generating devices through which the user constructs speech via input devices. These devices are typically devoid of any context related to the user's position, geographic location, scene understanding, and so on. With the recent addition of cameras, global positioning sensors, and fast computing power to mobile devices, we can make the augmentative communication experience more immersive while removing one level of friction between user and machine. This work will result in a new paradigm for people with speech disabilities to interact with augmentative speech devices. We are developing an innovative tool that enhances communication devices by immersing the device in the speaker's environment, resulting in context-informed augmentative communication.
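One way context can reduce friction is by ranking a phrase vocabulary against tags inferred from location or scene, so the most relevant utterances surface first. The sketch below assumes a tiny tagged phrase set and a tag-overlap ranking; the contexts, phrases, and scoring are hypothetical stand-ins for real sensor-derived context, not the project's actual design.

```python
# Hypothetical phrase bank: (utterance, contexts where it is likely useful)
PHRASES = [
    ("I'd like to order, please.", {"restaurant"}),
    ("Could I get the check?",     {"restaurant"}),
    ("Which platform is my bus?",  {"transit"}),
    ("I need a ramp to board.",    {"transit"}),
    ("Nice to meet you.",          {"restaurant", "transit", "home"}),
]

def suggest(context_tags, k=3):
    """Rank phrases by overlap with tags inferred from location/scene,
    then keep only the top-k phrases that match the current context."""
    scored = sorted(PHRASES, key=lambda p: -len(p[1] & context_tags))
    return [text for text, tags in scored[:k] if tags & context_tags]

print(suggest({"restaurant"}))  # restaurant phrases surface; transit ones do not
```

Even this crude ranking shows the payoff: instead of navigating a full vocabulary tree, the speaker sees a short, situation-appropriate list, which is the "one level of friction" the project aims to remove.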