Mobility design challenge and class project focused on downtown Culver City. The area is confusing for pedestrians due to poor wayfinding, especially after sunset, so we designed a light-projection system that suggests personalized activities and uses animated animal avatars to guide visitors as they explore.


Team leader, UX Designer


Zeph Swart, Casey Montz and Sara Pope


Casey added quick illustrations to the footage to represent the projections



Culver City lacks an easy-to-navigate grid and the downtown is dominated by commuter car traffic.

During daytime visits we noted a great deal of ongoing and recently completed construction. Existing signage is static, sun-faded, and oriented toward road traffic, which doesn’t help pedestrians trying to find their way across town. The roads are wide and crosswalks are far apart; illegal pedestrian crossing is dangerous, but commonplace. The few main landmarks are obscured by poor sight lines caused by building heights and trees. The walk from the Metro station and parking structures to shops and restaurants can take well over ten minutes once waiting at streetlights is factored in.

When we visited during nighttime hours, we saw all the same problems. Alarmingly, many pedestrians were using smartphones as navigation aids. The best-lit routes were those closest to car traffic, and crossing some large intersections required using multiple crosswalks.

1/2 mile radius, 10 minutes’ walk – CC Visioning Plan

Visioning Plan on how wayfinding helps the city – Zeph


The downtown restaurant, retail, and major entertainment industry hub is surrounded by neighborhoods. As it happens, Culver City, an important stakeholder, had recently completed a comprehensive Transit Oriented Development (TOD) Visioning Plan with input from another key stakeholder, the general public, gathered at community workshops and public hearings. The same pedestrian pain points were documented within the TOD Visioning Plan.

Secondary Research


We decided to focus on wayfinding for people walking around downtown Culver City at night:

How might we create an engaging experience for nighttime visitors to Culver City while reducing their reliance on phones for navigation?


We began to investigate a light-based projection wayfinding system that could use real-time data to guide visitors toward unique experiences along pedestrian-friendly pathways. Our research suggested it might be possible to use IoT-capable smart streetlights outfitted with 360º projectors.

Zeph & Casey walking a fish cutout

Translucent film for projection testing


We experimented with stick-mounted props and silhouetted projections, shining a flashlight through a cut-out covered by colored translucent film. This was useful during bodystorming and while demonstrating interactions with avatar projections, but we then moved on to prototyping and testing a mobile app.

Simulating projection


We started with a simple paper prototype. Testing revealed some first-time-use, organization, and naming problems. Users were successful at completing the primary task of discovering and launching projection routing, but the most significant iteration resulted from most users skipping the “Surprise Me” option. Post-test interviews led us to the insight that users should be able to activate specific categories beforehand.

User testing a Lo-Fi prototype


We incorporated the testing insights as we finalized the primary user flow.

User flow in the final prototype – Casey


From our midterm presentation


The overall concept was positively received, especially the playful nature of the projections. Concerns were raised about how the projections might work in densely populated areas and whether there would be any daytime functionality.

Slide Deck: PDF


The project’s reliance on evolving technology left us feeling that we were ignoring potential obstacles. Testing the projections on site with actual pedestrians would have yielded additional insights, and the interactive behavior of the projections was left underdeveloped. An in-depth technical feasibility study of networked, projection-based routing would be an obvious next step.



The video was a group effort: Casey (camera, animation, editing), Zeph (voice, script doctoring), and Sara (acting), while I worked on the script and handled planning and supervision.


The logo evolved from one of Casey’s storyboard panels, in which he sketched a bulb containing a crescent moon. For the final logo, I replaced the moon with shoeprints to convey the pedestrian aspect of our concept.