Session Descriptions (2017)

This year’s event includes:

  • Cognitive Assistant for the Blind (Keynote) by Chieko Asakawa from Japan
    Computers have been changing the lives of blind people. Synthesized voice helped them access online services. Now, new cognitive computing technologies are reaching the point where computers can help people who are blind and visually impaired sense, recognize, and understand the real world. In this talk, Chieko will discuss her research on information accessibility at IBM Research. After introducing the concept of a Cognitive Assistant for the Blind, she will demonstrate NavCog, a large-scale indoor navigation system her team has developed as a first step toward that goal. Lastly, she will cover computer vision-based technologies that can help improve quality of life for the blind, with a variety of demos and videos.
  • Panel: Impact of Artificial Intelligence (AI) on Accessibility Solutions (Dr. Ruoyi Zhou, Moderator)
    Innovations in artificial intelligence, machine learning as well as augmented and virtual reality have opened up new possibilities and solutions to benefit people with disabilities. By enhancing sensory capabilities, creating accessible interfaces, as well as enabling new breakthroughs in areas like transportation, AI has the potential to rapidly offer new and empowering experiences. Join our panel of leading innovators to discuss how AI will shape the future of accessibility solutions.
  • Inclusionary Technology: A Solution for Managing Accommodations of a Mobile Workforce (Lou Orslene and Peter Fay)
    The foundation for developing an inclusive and productive workforce is the accommodation process. The mobility of today’s workforce increases the complexity of managing this process effectively. Recognizing this need, the Center for Disability Inclusion at West Virginia University has partnered with IBM, the Job Accommodation Network, and a number of business and disability organizations to develop a free accommodation case management app for tablets and mobile phones. App development is funded by the National Institute on Disability, Independent Living, and Rehabilitation Research. This session will include a presentation and a demonstration of the new app, to be released in 2017.
  • Panel: Innovations in Affordable Assistive Technology
    This panel will explore innovations in affordable assistive technology and devices for individuals with blindness or low vision. While there have been many advances in the field of assistive technology and devices, the high cost of such innovations keeps many of them out of reach for the visually impaired. The focus of this panel will be to highlight innovations that are affordable and can be scaled up to reach large numbers of people at low cost.
  • Panel: Empowering and Employing the Blind and Visually Impaired (Karen Young, Moderator; Cheryl Cumings; Nicole Ross, Sonal Patel)
    This panel will examine how we can prepare the visually impaired to gain relevant skills and good employment opportunities in today’s economy. The subject will be explored from three different perspectives: educator, employer, and clinician. Topics will include how to address the challenges the visually impaired face in finding the right skills and jobs, experiences in teaching programming to the visually impaired, the use of MOOCs to provide high-quality, relevant skills, and vision rehabilitation from a clinical perspective.
  • Panel: Navigating an Inaccessible World: Inclusive Design for Mobile Apps and Websites (Jennifer Sagalyn, Director of Business Development; Joann Becker, Manager Access Technology Training; Taylor Snook, Digital Accessibility Consultant)
    Whether an assistive technology user is navigating to a bus stop or to a web page, Perkins knows how to make the world more accessible by leveraging inclusive design. Joann, a Perkins staffer who is blind, will demonstrate how Perkins went from recognizing the micro-navigation challenge of finding a bus stop to building an accessible mobile app—BlindWays—that picks up where GPS leaves off, guiding travelers to within a cane’s distance of a bus stop sign. Inclusive design applies to websites too. Taylor Snook will share how Perkins Access considers the needs of users with disabilities during wireframe template and product requirement reviews to save time, effort, and money. Taylor and Joann will also highlight how including the “user” in user testing is essential to building a better product for all, as well as complying with accessibility standards and policies.
  • SNaSI: Wearable Tech to Help the Blind and Visually Disabled in Face-to-face Interactions (Rébecca Kleinberger)
    MIT Media Lab and Microsoft Research have created Social Navigation through Subtle Interactions (SNaSI), a wearable system designed to help blind people in face-to-face interactions. Assistive systems for the blind have been an important area of research in the wearable community for many decades. Most of those systems focus on spatial navigation issues. In the last few years, we have started to see a move toward technologies that assist the blind with social navigation. Most of those systems treat social navigation the same way as spatial navigation, focusing mainly on utilitarian aspects of human interaction (what is needed to obtain information, what information is exchanged, etc.). In our research, we determined that we needed to think first about social accessibility and respect for human connection. Most of the time, face-to-face interaction aims primarily to create and reinforce human connection rather than to exchange information. We focused on the importance of designing with subtlety with regard to framing, reasoning, and design challenges. During this session, learn how our design criteria were guided by subtlety and social acceptability.
  • Ableism, AI, Filtering, and Assistive Technology: Take Off the Digital Blindfold (Sassy Outwater)
    Relying on AI for captioning of images while using a screen reader means that a disabled person is subject to developer filtering, AI learning from its creators’ biases, and choices in what the AI will describe. Filtering equates to censorship—“ableism” by any other name. Filtering: “You’re not my mom!” You are an AI developer. When content is suggestive, pornographic, violent, or otherwise deemed inappropriate by a company or entity, many AI creators simply tell the AI to describe it as “adult content.” Aren’t I a blind adult? Keyword: adult. What about disabled people’s choices of entertainment frightens creators so much that they feel the need to filter content out? Legal concerns? Societal ones? And where does that intersect with ableism and digital accessibility best practices? How do we teach AI not to carry over human assumptive behaviors and biases about users’ autonomy, content choices, and interactive methodologies?
  • Cognitive Web Accessibility: Eye Tracking, Machine Learning, and App Development (student panel)