ENCODED

Digital Intervention at The Met

ENCODED is an Augmented Reality intervention envisioned by Amplifier, with technical execution by EyeJack in collaboration with Amplifier. The experience launched on Indigenous Peoples’ Day 2025, taking over The Met’s American Wing. Seventeen Indigenous artists from across Turtle Island use AR and sound to reclaim space, expand narratives, and activate the museum in new ways. The experience guides visitors through 25 artworks, bringing each intervention to life when users point their cameras at paintings, sculptures, or immersive works.

The Art

ENCODED features works by: Amelia Winger-Bearskin, Bear Fox, BirdxBird, Cannupa Hanska Luger, Cass Gardiner, Demian DinéYazhi´ (fabricated in collaboration with Lite Brite Neon), Flechas, Jarrette Werk, Jeremy Dennis, Josué Rivas, Katsitsionni Fox, Lokotah Sanborn, Mali Obomsawin, Mer Young, Nicholas Galanin, Priscilla Dobler Dzul, and Skawennati.

Each artist brings a unique perspective, engaging with The Met’s collection through humor, ancestral presence, and social commentary. The exhibition was designed by Amplifier and co-curated with Tracy Renée Rector, and made possible through the support of an Indigenous funder, Pop Culture Collaborative, and a multicultural, multinational team of artists, technologists, organizers, and collaborators.

Mer Young (Chichimeca & Ndé (Apache)), We'wah Lhamana, 2025
Mer Young (Chichimeca & Ndé (Apache)), Deb Haaland, 2021

Process

Amplifier led the creative vision and coordinated the artists, while EyeJack provided technical support and built the AR experience, including all UX and UI design. Given the scale of The Met and its complex layout, we developed a unique wayfinding and onboarding system to guide visitors through the experience. This involved balancing a planned user journey with flexible design solutions for real-world scenarios, and we collaborated closely with Amplifier on on-site testing of 8th Wall’s tracking capabilities, including VPS tests on sculptures and sky segmentation experiments.

After these tests, the final AR experience uses a combination of VPS tracking for sculptures, 2D image targets for paintings, and SLAM tracking to bring artworks to life. Users can browse maps of each level to locate the 25 target artworks and interact with them on their mobile devices, experiencing sound, animation, and 3D effects that expand and reframe the museum’s narratives.
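To illustrate how an experience like this might route each artwork to its tracking method, the short TypeScript sketch below models a target registry and a dispatch step. It is an assumption-based illustration only: the identifiers, titles, asset paths, and handler names are hypothetical, and the engine-specific event wiring (for example, 8th Wall’s target-detection events) is deliberately omitted.

// Minimal sketch (not the production code) of how the 25 artwork targets
// might be registered and dispatched to the right tracking method.
// All IDs, titles, and asset paths here are illustrative assumptions.

type TrackingMode = "vps" | "image-target" | "slam";

interface ArtworkTarget {
  id: string;         // internal identifier for the artwork
  title: string;      // display title shown in the UI
  level: number;      // museum floor, used by the wayfinding map
  mode: TrackingMode; // how the AR content is anchored
  media: string[];    // sound / animation / 3D assets to activate
}

// Hypothetical registry: sculptures use VPS, paintings use 2D image
// targets, and immersive works fall back to SLAM world tracking.
const targets: ArtworkTarget[] = [
  { id: "sculpture-01", title: "Example sculpture", level: 1, mode: "vps", media: ["audio/sculpture-01.mp3"] },
  { id: "painting-07", title: "Example painting", level: 2, mode: "image-target", media: ["anim/painting-07.glb"] },
  { id: "immersive-03", title: "Example immersive work", level: 2, mode: "slam", media: ["fx/immersive-03.glb"] },
];

// Placeholder for asset loading and playback (sound, animation, 3D effects).
function loadAndPlay(assetPath: string): void {
  console.log(`Loading asset: ${assetPath}`);
}

// Called when the AR engine recognizes a target; looks up the artwork
// and activates its media.
function onTargetFound(id: string): void {
  const target = targets.find((t) => t.id === id);
  if (!target) return;
  console.log(`Activating "${target.title}" via ${target.mode} tracking`);
  target.media.forEach((asset) => loadAndPlay(asset));
}

// Example: the camera detects the painting target.
onTargetFound("painting-07");

A registry of this kind could also feed the per-level maps used for wayfinding, though the production implementation may be organized differently.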
