UX/XR researcher and developer
What does that look like? Most often it means videogames! But it also means chat rooms, forums, BBSes and IRC. It means webrings and Twitch streams; it's all the ways we explore who we are in our shared digital environment. In service of that, I wear a lot of hats. I am an avid software developer in C, Lua, Python, Java, and the scripting languages that Godot, Unity and other development tools use. I also do 3D modeling and animation with Maya and Blender to support my software projects and research.
michellevcormier@gmail.com
Completed coursework in pharmacology, abnormal psychology, and research psychology; managed experimental trials and subjects for Dr. John Salamone in support of graduate students' Alzheimer's research.
Worked with the Play and Interactive Experiences Laboratory as a graduate research assistant on numerous projects, many funded by the National Science Foundation. Assisted in developing Unity-based AR software used for experiments in which numerous AI-driven entities in a digital environment were controlled by a human user with a peripheral device in the real world. Virtual and real locations were synchronized using geo-positional data, and I managed the ROS bridging between the virtual AI-driven drones, the Unity environment, and the human user.
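The project's actual bridge code isn't reproduced here, but as a minimal sketch of the general pattern, assuming a rosbridge websocket server and the roslibpy client library, with hypothetical topic names and placeholder coordinates, the glue between ROS and an external client might look roughly like this:

# Minimal sketch of a ROS <-> external-client bridge using roslibpy.
# Topic names, message types, and coordinates are hypothetical; the real
# project's interfaces are not reproduced here.
import roslibpy

# Connect to a rosbridge websocket server (assumed to be running locally on port 9090).
client = roslibpy.Ros(host='localhost', port=9090)
client.run()

# Publish the human user's real-world position (from the peripheral / GPS)
# so the simulated drones can synchronize against it.
user_pose = roslibpy.Topic(client, '/user/pose', 'geometry_msgs/PoseStamped')

# Subscribe to a simulated drone's pose so the Unity server can render it.
drone_pose = roslibpy.Topic(client, '/drone_0/pose', 'geometry_msgs/PoseStamped')
drone_pose.subscribe(lambda msg: print('drone_0 at', msg['pose']['position']))

# Send one user pose update (placeholder values).
user_pose.publish(roslibpy.Message({
    'header': {'frame_id': 'map'},
    'pose': {'position': {'x': 1.0, 'y': 2.0, 'z': 0.0},
             'orientation': {'x': 0, 'y': 0, 'z': 0, 'w': 1}},
}))

client.terminate()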
This work has only recently begun; it focuses on how human identity can be formed in digital environments and can continue to develop there throughout a person's life. I am in the preliminary stages of designing a VR experience for the study, built in Godot and using low-cost body tracking devices.
Top: A 3D-animated Roomba, intended to be used as an avatar for a real-world Roomba. The avatar would appear above the real Roomba when viewed through an AR unit. This design and others were used as part of a user experience survey on AI avatar designs for robotics.
Bottom: Gato from Chrono Trigger, modeled in Maya and rendered through Adobe After Effects; an example of the modeling, rigging, and animation skills I learned during my coursework.
Top: Unity server view of the human user and one of their controlled drones. The intended user interface (as viewed through a head-mounted unit) is mocked up to give an idea of what it will look like in production.
Bottom: Real-world view of the control peripheral. Compare this to the peripheral shown in the Unity mock-up. I worked extensively on the ROS bridging between the emulated AI drones on the Unity server and human control through the real-world peripheral. Interaction with goals, collectibles, and hazards was handled through this client-server relationship, which I also worked on to support the AR experimental design.
Top: During the ideation and design phases I led my team in building multiple physical artifacts to represent potential interface elements. These were used in user experience surveys throughout the design process on a rapid-iteration production schedule.
Bottom: At the end of the semester production cycle we demonstrated a VR prototype of what was intended to be delivered as an AR interface given a longer development cycle. This release allowed a user to turn multiple devices on and off with simple gestures and to place a coffee order with another suite of gestures. These gestures were also derived from earlier user experience surveys.
Top: Empathy_OS, a game built in Godot that requires the player to use various mouse-based gestures (swipes, holds, clicks) to manage a variety of monsters and the player's vitals; a rough sketch of this kind of gesture detection follows after these captions.
Bottom: Sweet Space Slide, built in Pico-8 to emulate 8-bit arcade videogames but with a unique circular spinning control scheme. The design is intentionally minimalist to aid clarity given Pico-8's two-button control scheme.
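Neither game's input code is shown in this portfolio, but as a rough illustration of the swipe / hold / click distinction that Empathy_OS relies on, here is a sketch written in Python with pygame rather than Godot, using made-up thresholds and placeholder actions:

# Rough sketch of swipe / hold / click classification from raw mouse events.
# Thresholds and the printed "actions" are illustrative only.
import pygame

CLICK_MAX_MS = 200   # presses shorter than this count as clicks
HOLD_MIN_MS = 500    # presses longer than this count as holds
SWIPE_MIN_PX = 40    # drags longer than this count as swipes

pygame.init()
screen = pygame.display.set_mode((320, 240))
clock = pygame.time.Clock()

press_pos = None
press_time = 0
running = True

while running:
    for event in pygame.event.get():
        if event.type == pygame.QUIT:
            running = False
        elif event.type == pygame.MOUSEBUTTONDOWN and event.button == 1:
            # Remember where and when the press started.
            press_pos = event.pos
            press_time = pygame.time.get_ticks()
        elif event.type == pygame.MOUSEBUTTONUP and event.button == 1 and press_pos:
            dx = event.pos[0] - press_pos[0]
            dy = event.pos[1] - press_pos[1]
            duration = pygame.time.get_ticks() - press_time
            if (dx * dx + dy * dy) ** 0.5 >= SWIPE_MIN_PX:
                print('swipe', (dx, dy))       # e.g. shove a monster away
            elif duration >= HOLD_MIN_MS:
                print('hold', duration, 'ms')  # e.g. soothe a monster
            elif duration <= CLICK_MAX_MS:
                print('click')                 # e.g. select or feed
            press_pos = None
    clock.tick(60)

pygame.quit()

The same idea carries over to Godot: compare mouse button press and release events against distance and time thresholds, then dispatch the matching game action.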