Unit 2: Synchronous Session Questions for Dr. Chris Gilliard

I’m curious about your thoughts on how spatial computing affects your existing discussion of luxury surveillance. With the release of Apple’s Vision Pro, which carries six external cameras (two forward-facing, two downward-facing, and two side-facing) and can capture 3D images, it feels like we may be inviting an unprecedented level of data harvesting into our lives in the spirit of entertainment. I’m particularly concerned that the interface and navigation are driven by intensely close eye tracking coupled with gestural activation and voice recognition. It all feels like an exponential escalation of the concerns you’ve raised about other Apple devices such as the AirTag or the Watch. At $3,500 it’s also highly exclusionary. But none of this has fully played out yet. What are your thoughts on what this looks like a year from now?

I’m fascinated by your ideas about unintended consequences. My area of research concerns the emergent field of grieftech: the ability to digitally preserve ourselves beyond death, and the ethical issues that arise from interacting with those no longer with us. Those choosing to engage willingly upload as much of themselves as they can into these systems in an attempt to help those left behind cope with loss. It’s a highly intimate experience, but one mirrored by the normalization of bringing back actors long gone or de-aging our older performers. Much of your work focuses on the ethics of this world, but what do you think of surveillance in the next? Do we already know how this story ends?

Weaponization of data isn’t new, but there are stark differences in targeting capability between what Cambridge Analytica did in 2016 and what is possible ahead of the 2024 election, after eight years of technological development and, in particular, the recent democratization of generative tools. Authenticity concerns are everywhere, and it feels as if the truth is harder than ever to reach. With the rhetoric of indictment and candidacy already ratcheting up 18 months before the country votes, it feels like we’re in for a punishing, grueling election cycle. What role do you speculate tools such as ChatGPT, MidJourney and others will play over the next year?

I’m a product manager, and I know all too well the pressures of the Google Maps example you shared. I feel as if I’ve heard the phrase “what do we know, and what do we care about” many, many times over the years. Mitigating associative risk for users is something my teams think about all the time: what are we willing to absorb, and what are we going to make sure we address ahead of launch? Much of it comes down to our ability to sustain the business. We’re willing to absorb the risk of annoying users with ads because the revenue they generate allows us to have a product in the first place. But as advertising budgets dry up or move to bigger, more efficient players, the challenge becomes one of engagement and habituation. And in that, simply staying in business becomes more and more dependent upon gaming attention.

