I'm getting involved, with minimal formal training, in a cartographic effort. If you find it interesting, I'd love to get your advice and feedback.

A blind friend of mine is a Computer Science student at UT Austin. Using nothing but a white cane, she navigates half a mile of city and campus streets before arriving at the CS complex and going to the appropriate classroom. She has no maps for any of this journey. I'd like to provide some tactile and/or audio maps for her, using as much mechanization as possible.

Input Data

The plan is to use OpenStreetMap as the source for data on external features (e.g., streets, buildings). Conveniently, OSM stores and distributes its map data as structured files (e.g., XML), rather than images. So, we don't have to do any image recognition to identify features.
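Because OSM data is structured XML, feature extraction can be done with a plain XML parser. As a minimal sketch (the tiny embedded extract, coordinates, and names below are illustrative, not real OSM data), here is one way to pull out streets and buildings by their tags:

```python
import xml.etree.ElementTree as ET

# A tiny, hypothetical OSM XML extract; real input would be an
# export of the area around campus from openstreetmap.org.
OSM_XML = """<osm version="0.6">
  <node id="1" lat="30.2862" lon="-97.7394"/>
  <node id="2" lat="30.2866" lon="-97.7390"/>
  <way id="10">
    <nd ref="1"/>
    <nd ref="2"/>
    <tag k="highway" v="residential"/>
    <tag k="name" v="Speedway"/>
  </way>
  <way id="11">
    <nd ref="1"/>
    <nd ref="2"/>
    <tag k="building" v="yes"/>
    <tag k="name" v="Gates Dell Complex"/>
  </way>
</osm>"""

def classify_ways(xml_text):
    """Return (streets, buildings) name lists from an OSM XML extract."""
    root = ET.fromstring(xml_text)
    streets, buildings = [], []
    for way in root.iter("way"):
        # Collect this way's key/value tags into a dict.
        tags = {t.get("k"): t.get("v") for t in way.iter("tag")}
        name = tags.get("name", "(unnamed)")
        if "highway" in tags:
            streets.append(name)
        elif "building" in tags:
            buildings.append(name)
    return streets, buildings

streets, buildings = classify_ways(OSM_XML)
print(streets)    # ['Speedway']
print(buildings)  # ['Gates Dell Complex']
```

In real use, the `highway` and `building` keys are the standard OSM tags for these feature classes; the node coordinates (`lat`/`lon`) would feed the tile geometry.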

Mapping internal features (e.g., floor plans of the CS complex) is a bit more problematic. We have some simplified images and have asked for more detailed information, but we'll have to edit all of this manually (e.g., using Adobe Illustrator) into an accessible form.

Output Format

The planned output format is a set of "tiles", laser-engraved from 1/8"-thick, white-coated Masonite wall paneling. Here's a first cut at a layout description.

Although I have a rough idea about how to encode features for tactile access, the details are still up for grabs. For example, I know that bumps are easier to detect (by touch) than holes, so I plan to engrave away all but the geographic features, Braille patterns, borders, etc. Instead of spelling out the names of buildings and streets, I plan to use single Braille patterns, coupled with one or more indexes (which might be stored online and accessed via a cell phone or computer).
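The single-cell labeling idea can be sketched in a few lines: each mapped feature gets one Braille cell on the tile, and a separate index (which could live online) expands the cell back to the full name. The feature names and dot assignments below are hypothetical placeholders, not a settled encoding:

```python
BRAILLE_BASE = 0x2800  # start of the Unicode Braille Patterns block

def cell(dots):
    """Build a Unicode Braille character from a set of dot numbers (1-6)."""
    bits = 0
    for d in dots:
        bits |= 1 << (d - 1)  # dot n maps to bit n-1 in the code point
    return chr(BRAILLE_BASE + bits)

# Hypothetical index: one cell per feature, expanded by lookup.
index = {
    cell({1, 2, 4, 5}): "Gates Dell Complex",  # the letter 'g'
    cell({2, 3, 4}):    "Speedway",            # the letter 's'
}

def lookup(label):
    """Expand a single-cell label to its full feature name."""
    return index.get(label, "unknown feature")

print(lookup(cell({1, 2, 4, 5})))  # Gates Dell Complex
```

Keeping the index as plain data like this would make it easy to serve the same mapping to a screen reader, a phone, or a printed key.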


For more information, please ask questions and/or see my wiki: Access/WebHome, Access/Tiles/WebHome, etc.

-r