Apple and Columbia University have unveiled SceneScout, a research prototype that pairs AI with Apple Maps street-level imagery to help blind and low-vision (BLV) users explore environments through rich scene descriptions. It offers two modes: Route Preview, which describes a chosen path segment by segment, and Virtual Exploration, which supports open-ended neighborhood browsing. While not yet a wearable, the project hints at future AI-powered accessibility tools. Early tests showed promise, though researchers noted accuracy issues and that participants preferred real-time, on-the-go information.

Source: 9to5Mac