In this year's Smart Objects class at the MFA in Interaction Design program at SVA, I'm trying something new.
What if we don't kit out an entire room with cameras and sensors, but instead have some small, intelligent sensors you can talk to and move around with you as your needs change?
What if interaction designers could take ownership of their experimentation with computer vision and multi-level communication, with a little help from Claude Code?
Enter Orbit, Horizon, and Gravity: three 16GB Raspberry Pi 5s, each with a Luxonis OAK-D camera attached.
So far, they can detect people, estimate gaze, classify fatigue, and read a whiteboard.
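I won't claim this is how the class pipeline works, but to make "classify fatigue" concrete: a common heuristic is PERCLOS, which flags fatigue when the eyes are closed (eye aspect ratio below a threshold) for more than some fraction of frames in a rolling window. The thresholds, window size, and function names below are illustrative, not the devices' actual values.

```python
# Hypothetical sketch of a PERCLOS-style fatigue classifier.
# Thresholds and window size are illustrative assumptions.
from collections import deque

EAR_CLOSED = 0.2      # eye aspect ratio below this counts as "closed"
PERCLOS_LIMIT = 0.3   # fraction of closed frames that signals fatigue

def classify_fatigue(ear_history) -> str:
    """Classify a window of per-frame eye-aspect-ratio values."""
    closed = sum(1 for ear in ear_history if ear < EAR_CLOSED)
    perclos = closed / len(ear_history)
    return "fatigued" if perclos > PERCLOS_LIMIT else "alert"

window = deque(maxlen=90)            # ~3 seconds at 30 fps
for ear in [0.3] * 50 + [0.1] * 40:  # simulated eye-aspect-ratio stream
    window.append(ear)
print(classify_fatigue(window))      # 40/90 of frames closed → "fatigued"
```

The appeal of a heuristic like this is that it runs comfortably on a Pi alongside the camera's on-device neural inference.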
To start, the devices announce who's running which application in a shared class Discord, and each student has a private Discord bot so they can experiment freely without spamming the class with notifications. You can ask a device to change its configuration or switch its current task through Discord messages.
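To sketch what "switch its current task through Discord messages" might reduce to: a small dispatch from chat text to device configuration. The discord.py plumbing is omitted here, and the task names, config shape, and command phrasing are all hypothetical, not the bots' real protocol.

```python
# Hypothetical sketch: map a chat message to a device config change.
# A real bot would call this from a discord.py on_message handler;
# task names and the config dict shape are illustrative assumptions.

TASKS = {"people", "gaze", "fatigue", "whiteboard"}

def handle_message(config: dict, text: str) -> tuple[dict, str]:
    """Apply a chat command to the device config; return (config, reply)."""
    words = text.lower().split()
    if len(words) >= 3 and words[:2] == ["switch", "to"] and words[2] in TASKS:
        return {**config, "task": words[2]}, (
            f"{config['name']} now running {words[2]} detection"
        )
    if words[:1] == ["status"]:
        return config, f"{config['name']} is running {config['task']}"
    return config, "commands: 'switch to <task>', 'status'"

config = {"name": "Orbit", "task": "people"}
config, reply = handle_message(config, "switch to gaze")
print(reply)  # → "Orbit now running gaze detection"
```

Keeping the dispatch a pure function like this makes it easy to test without a live Discord connection, which matters when each student is iterating on their own private bot.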
As the semester moves forward, we'll add more ways to communicate. We'll decide which skills are useful and which skills feel invasive. We'll aim to have all data and interaction live locally and examine how this enhances privacy, ethical use of AI, and responsible power consumption.
I'll write here about how this experiment in distributed intelligence and shared design and development pans out. Hopefully, we will have more successes than failures.
We will certainly learn a lot.