
Developing Avatars for Magic Leap 1

2017 - 2018

I led product for Magic Leap Avatars. We created avatars with highly expressive eyes and faces, driven by rich user sensing, to evoke playfulness and a deep sense of copresence. 

Behavior

Only a small fraction of communication is what we say. The rest we communicate through our body language: head movements, the expressions in our eyes, gesticulations, and proximity. These nonverbal behaviors give us the sense of real presence: the more embodied a communication medium is, the more presence we feel.

Magic Leap enables rich user sensing through direct and inferred tracking. This sensor input drives avatars' expressive nonverbals. 

We chose a "purist" approach to the character design of the avatars, limiting their visual fidelity to the fidelity of user input. This choice let us develop highly expressive but stylized avatars, avoiding an uncanny valley. 

Diversity and Inclusion

Representing rich skin tones on a mixed reality, additive lightfield display is an artistic and technical challenge. We were inspired by the rich color, artful contrast, and innovative lighting of dark skin in Moonlight (2016) as we developed inclusive materials and lighting to beautifully represent people of color.


The lead visual designer for avatars, Lorena Pazmino, used Pantone skin tone swatches to test rigorously on device.

We enthusiastically committed to representing underrepresented features, including developing a range of inclusive hair styles for avatars. 

Personalization App

We developed an Avatar personalization app to enable users to personalize the look, colors, and accessories of their Avatar. 

Samples from the cutting room floor 

Demonstrations of process and thinking

Example: Roadmap for avatar realism presented to leadership

Example: High-level roadmap produced with technical teams. Love a good roadmap.

I still love this concept! Avatar Auras - I developed and proposed the concept of the Aura as a secondary, supporting element to avatars, easing transitions where avatars' bodies would de- or re-materialize (leaving a call, walking through walls, gestures flickering, what have you). The Aura could also add a new expressive dimension to avatar behavior, enabling us to visualize user sensing like pupil dilation and heart rate.


I worked with my team to further develop concept art and prototypes. We ultimately decided the idea needed more work than we had time for.

We were inspired by the stylized flocking behavior in Finding Nemo and the particle effects in Method Studios' video for Major Lazer's "Light It Up". I developed hand-drawn animations to illustrate concepts with my team.


Key Contributions

Strategy - Provided the initial vision and conceptualized the approach to avatars (highly expressive, stylized human avatars, with assets developed in-house).

Design Leadership - Interviewed and hired 3 full-time artists, and negotiated a collaboration with ML Studios for art support. Established and championed core design principles for the project, including adult proportions and inclusivity. Held weekly concept and asset reviews alongside our visual design lead. Supported the team in user testing the concepts.

Execution - Led a cross-functional team to integrate the assets into the avatar behavioral engine, and worked across functions to support end-to-end turn-on of an ambitious set of sensor dependencies.


Dream Team 

I had the honor to partner with a singularly gifted, ride-or-die team on this work: 

Visual Design - Lorena Pazmino (lead), Ian Mankowski, Christina Lee, Joe Olsen
Interaction Design - Karen Stolzenberg (lead), Cole Heiner, James Powderly (director of UX)
Engineering - Richard Bailey, Alex Illic, Koichi Mori, Tomislav Pejsa


