Urban Technology at University of Michigan week 60
Recap of three experimental projects afoot this summer
Hi again. Did you think we would disappear after interviewing the Lt. Governor? Nah. We’re just getting going! As excited as we were to take a 6-week break, we’re happy to be back. How did August treat you?
This week we’re sharing quick glimpses of summer research experiments, with emphasis on the experiment part of that sentence. Michigan is a Tier-1 research university and that’s an important element of our institution, but for every major research project (and there are plenty) we have even more tiny little experiments in the labs and studios across campus. Sparks, if you will. This week we’re sharing a few of our own.
Hello! This is the newsletter of the Urban Technology program at University of Michigan, written by faculty director Bryan Boyer, to explore the ways in which technology is reshaping urban life. If you’re new here, try this 90 second video introduction.
Programming update: Moving forward we will be publishing roughly every two weeks instead of the weekly cadence from the first year (!) of this newsletter.
🙅 Experiment 1: Public engagement tools for things that don’t exist yet
How do you ask for input about a hypothetical? And how do you ask for input about a hypothetical when that possibility involves things that are unfamiliar? If the primary impacts of a future technology are visual, then you may resort to simulations of the physical/visual form, such as the use of fire to simulate the Round city of Baghdad some 1,000 years ago. A more modern approach would be VR, perhaps. What about things that are mostly invisible but have a large audio presence?
Drone logistics in cities have been promised to speed up delivery times of everything from burritos to Pepto-Bismol, but at what cost? Our Urban Buzz experiment this summer set out to ask: how much change in the urban soundtrack are you willing to accept in exchange for convenience? We are exploring this by building a public engagement tool that uses audio-based augmented reality to layer the sound of drone logistics on top of the existing hum of your city.
In this project we’re hoping to find a middle ground between “Drones?! Never!” and “Drones?! Of course, always!” by making it possible for people to engage with the sensorial reality of a hypothetical future where drones are frequently used for local deliveries. That’s a fancy way to describe what boils down to an iOS/Android app, built on the Unity game engine, that uses headphones in “transparency” mode to present a set of audio simulations. This summer our focus was on building the survey instrument as a proof of concept. Next summer we plan to conduct surveys with it and start building a data set of public sentiment.
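The actual app is built in Unity, but the core idea of layering a positional drone sound over ambient audio can be sketched in a few lines of Python. Everything here is illustrative, not taken from the real project: the inverse-square falloff is a standard free-field approximation, and the function names and numbers are invented.

```python
def drone_gain(distance_m: float, ref_distance_m: float = 1.0) -> float:
    """Inverse-square falloff: gain 1.0 at the reference distance,
    shrinking with the square of distance (free-field approximation)."""
    d = max(distance_m, ref_distance_m)
    return (ref_distance_m / d) ** 2

def mix(ambient: list[float], drone: list[float], distance_m: float) -> list[float]:
    """Layer an attenuated drone loop over ambient audio samples,
    clipping the result to the [-1.0, 1.0] sample range."""
    g = drone_gain(distance_m)
    return [max(-1.0, min(1.0, a + g * d)) for a, d in zip(ambient, drone)]

# A drone 10 m away contributes roughly 1/100th of its reference amplitude.
print(round(drone_gain(10.0), 4))  # 0.01
```

In the real app this kind of per-source attenuation (plus stereo panning) is what the game engine handles, which is why Unity is a reasonable base for an audio-only AR experience.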
Thanks to recent U-M grad student Christopher Walker for his efforts as lead developer on this project.
🍨 Experiment 2: How to use machine learning to get free ice cream
Continuing the theme of building tools to enhance public debate about technologies likely to have a growing presence in our lives, the second experiment tackled computer vision and machine learning. Working with my Taubman College colleague Anthony Vanky, as well as Meghana Tummala and Christian Wong, two U-M undergrads, we built a public art installation for the Detroit Month of Design called Face :Detector. The installation uses a handful of computer vision models to guess the age, gender, and mood of each passerby and then presents this information back to the viewer in two different styles.
The installation consists of two screens. The light side shows an interactive game: if you and a bunch of friends are all happy, scoops of ice cream stack up and you achieve “party mode,” with confetti streaming down the screen. Whee! The flip side of the installation is the dark side, showing a face-by-face assessment of the core attributes of gender, age, and mood, presented in minute detail, as if on a Bloomberg terminal of surveillance data.
The goal here is to expose some of the foundational principles of machine learning to a popular audience in a way that is playful and interactive. The installation creates a way to play with, and find the seams of, how computer vision and machine learning operate together. Those seams matter because they highlight the limits of the systems in question. First, the models used by machine learning are based on probability, not certainty; machine learning does not “conclude,” it can only “suggest.” Second, machine learning suggestions are reflections of their training data. When the training data simplifies the world, for example with a strict binary representation of gender, the computer sees only Male and Female and nothing else. Similar pitfalls exist for interpretations of race and just about anything else a computer might be asked to guess.
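Both points above have a simple structural form that a toy example can make concrete. The sketch below is not the installation’s code; the labels and scores are invented. It shows a softmax over raw scores, the standard way classifiers turn scores into probabilities: every answer is a probability distribution (a suggestion, never a 100% conclusion), and it can only ever range over the labels baked in at training time.

```python
import math

# A strict binary fixed at training time: the model literally cannot
# answer with anything outside this list.
TRAINED_LABELS = ["Male", "Female"]

def classify(scores: list[float]) -> list[tuple[str, float]]:
    """Softmax over raw scores: returns (label, probability) pairs that
    always sum to 1.0 and are always strictly between 0 and 1."""
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [(label, e / total) for label, e in zip(TRAINED_LABELS, exps)]

result = classify([2.0, 1.0])
# The model "suggests" rather than "concludes": roughly 73% vs 27%.
print(result)
```

Whatever a passerby actually looks like, the output is confined to `TRAINED_LABELS`, which is the pitfall the dark side of the installation is meant to surface.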
The point of the two-sided installation is to create a view into the “mind” of the computer: on one hand, the convincing nature of these systems by virtue of their apparently seamless technical performance; on the other, the fallibility and straight-up quirkiness of systems that rely on computer vision and machine learning to make consequential decisions IRL, such as Project Green Light here in Detroit.
This project is viewable at the Milk & Froth ice cream shop on W. Congress Street in downtown Detroit (disclosure: my architecture firm designed the interior and branding for the ice cream shop) and will be up through the end of September. Check it out! We’ll post a follow up with photos from the install and more reflections on the experience in a future newsletter.
This project was partially inspired by MIT’s Mood Meter and supported by Taubman College and U-M Arts Engine.
👻 Experiment 3: ghost building gives up the ghost
Remember the ghost kitchen that we’ve been low-key obsessed with this year? It disappeared! This was not an experiment that we conducted, per se, but NBRHD (and parent company Reef Technologies) were certainly paying close attention to the data and have apparently determined that the Gratiot Ave location is no longer advantageous. Experiment #3 is theirs and we’re mere observers.
But do we know if it was a failure? Not really. Perhaps Reef discovered that it was such a good location that they want to convert from a hodgepodge trailer to a permanent kitchen somewhere nearby. The fact that there’s no mechanism to know one way or the other is a missed opportunity.
If the private sector is going to continue experimenting in ways that introduce new risk to the public, then perhaps there’s at minimum a way to compel such entities to share their rationale for where they land and why they depart? If the rules and regulations describing safe, typical operations of something like a restaurant are to be stretched for the sake of experiments such as NBRHD, how can government take this as an opportunity to learn about perceptions and the functional reality of the city by compelling some level of data sharing? Or is the true ghostly nature of the ghost kitchen that it’s so off the grid it doesn’t even produce data that could feed future decision making?
Links
☎ Rotary cell phone, filed under whoaaaaa.
🚁 Wired took a deep dive into Amazon UK’s drone business. Is this the “collapse” or the trough of the Gartner Hype Cycle?
🚇 It’s not just drones that make noise, but trains too. NYT had a nice feature on train melodies this summer. 🎶
🚚 Dan Hill surveyed the small vehicles of Tokyo. Closer to home, some people are repurposing “meter maid” (parking police) cars.
🛴 Realtime-ish map of all the scooters in Los Angeles.
These weeks: happy to be back. Among the many things checked off the to-do list this summer was “get whiteboard.” Watch out, world! 🏃