For this project, I needed to work within two constraints — use Node.js and use my 3D avatar — combining my Live Web and Performative Avatar classes.

I started by exploring how to put my avatar on the web. Three.js seemed like a good tool for the job. I found an in-depth tutorial that shows how to combine multiple Mixamo animations in Blender and export them as a single glTF file with the Khronos Group glTF exporter.
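The post doesn't show the loading code, but in Three.js this step might look roughly like the sketch below. The file name `avatar.glb` and the clip name `idle` are placeholders for my own export; the browser-only part is guarded so the helper can also be exercised outside a page.

```javascript
// Pure helper: find an exported animation clip by name
// (mirrors what THREE.AnimationClip.findByName does).
function pickClip(clips, name) {
  return clips.find((clip) => clip.name === name) || null;
}

// Browser-only sketch, guarded so the file also loads under plain Node.
// Assumes a page with `three` installed and a scene/renderer already set up.
if (typeof window !== 'undefined') {
  Promise.all([
    import('three'),
    import('three/examples/jsm/loaders/GLTFLoader.js'),
  ]).then(([THREE, { GLTFLoader }]) => {
    new GLTFLoader().load('avatar.glb', (gltf) => {
      // Each Mixamo animation combined in Blender becomes one clip here.
      const mixer = new THREE.AnimationMixer(gltf.scene);
      const idle = pickClip(gltf.animations, 'idle');
      if (idle) mixer.clipAction(idle).play();
      // In the render loop: mixer.update(clock.getDelta());
    });
  });
}
```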

After loading the glTF file, I experimented with the animation and joint systems in Three.js. I freed up the joints I want users to control even while the animations play: the left shoulder, right shoulder, neck, and spine. These joints follow the position of the moving mouse. Through Node, users collaboratively create disruption on my avatar — the more people on the site, the more chaotic it gets.
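A minimal sketch of the joint control, under my assumptions about the setup: bone names follow the standard Mixamo rig (e.g. `mixamorigLeftShoulder`), and the maximum angle is an arbitrary choice. The mapping itself is a pure function; the browser wiring that applies it after the mixer updates (so user input overrides the baked animation on that bone) is shown in comments.

```javascript
// Pure mapping: normalized mouse coords (-1..1) -> bone rotation in radians.
// maxAngle caps how far a freed joint can be pushed off its animated pose.
function mouseToRotation(nx, ny, maxAngle = Math.PI / 3) {
  const clamp = (v) => Math.max(-1, Math.min(1, v));
  return { x: clamp(ny) * maxAngle, z: clamp(nx) * maxAngle };
}

// Browser-side usage (assumes `avatar` is the loaded gltf.scene):
// const shoulder = avatar.getObjectByName('mixamorigLeftShoulder');
// window.addEventListener('mousemove', (e) => {
//   const nx = (e.clientX / window.innerWidth) * 2 - 1;
//   const ny = -((e.clientY / window.innerHeight) * 2 - 1);
//   const rot = mouseToRotation(nx, ny);
//   // Apply each frame AFTER mixer.update() so the mouse wins on this bone:
//   shoulder.rotation.x = rot.x;
//   shoulder.rotation.z = rot.z;
// });
```

For the collaborative part, a Node server would rebroadcast each visitor's joint input to every other connected client, so all mice tug on the same skeleton at once.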

After this, I decided to bring in mobile device sensors to make the web page accessible to more people. I use the browser's DeviceOrientation, DeviceMotion, and touch events (tested in Chrome), so when users shake, turn, or touch their phone, the avatar reacts to each gesture differently.
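A sketch of how sensor readings might drive the avatar. The degree-to-radian mapping and the shake threshold are my assumptions, not values from the project; the event wiring is shown in comments since those listeners only exist in a browser.

```javascript
const DEG2RAD = Math.PI / 180;

// Orientation (beta/gamma in degrees, per DeviceOrientationEvent)
// -> avatar rotation in radians.
function orientationToRotation({ beta = 0, gamma = 0 }) {
  return { x: beta * DEG2RAD, z: gamma * DEG2RAD };
}

// Crude shake detector: true when the acceleration magnitude
// exceeds a threshold (m/s^2, tuned by hand).
function isShake({ x = 0, y = 0, z = 0 }, threshold = 15) {
  return Math.hypot(x, y, z) > threshold;
}

// Browser wiring (Chrome; iOS additionally gates these behind a
// DeviceOrientationEvent.requestPermission() prompt):
// window.addEventListener('deviceorientation', (e) => {
//   const rot = orientationToRotation(e);
//   avatar.rotation.x = rot.x;
//   avatar.rotation.z = rot.z;
// });
// window.addEventListener('devicemotion', (e) => {
//   if (isShake(e.accelerationIncludingGravity)) {
//     // react to the shake, e.g. trigger a more chaotic animation
//   }
// });
```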


More on Three.js