A gesture-based spell-casting game using Intel RealSense, created in Unity.
Photoshop, Unity 3D (specifically 2D Toolkit), Intel RealSense
In 2013, my husband and I created a prototype for a gesture-based spell-casting game. We entered it into Intel’s Perceptual Computing Challenge and won first place in the game category and second place overall. This was a huge win for us, and it allowed us to partner with Helios Interactive to create our studio, Livid Interactive. Livid continued development on the game for about a year as part of Intel’s RealSense Showcase, which was demoed at CES, the Game Developers Conference, and IDF.
I designed six playable characters for the game and oversaw their 3D creation and animation.
I also created the UX and UI. Hand-tracking applications were still very new, and I had to keep in mind that most of our users would be interacting with one for the first time. Across all of our demo experiences, we found that first-time users struggled to understand the shapes they were being asked to make and what accuracy was based on.

In most applications, an action is recognized by a touch or a click: there is a clear point of touchdown, a duration of contact, and a lift-off. Gesture input has none of these, so we had to come up with something creative. For Head of the Order, the “touch down” was the moment the hands started moving quickly, and the “lift off” was the moment they stopped. Every movement between those two points was treated as the shape the player was drawing. This method increased player accuracy.

We also created short video captures of each spell being performed correctly. Players could reference these in the training area if they were having a hard time getting the results they were after.
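The speed-based segmentation described above can be sketched in a few lines. This is a minimal illustration, not the game's actual implementation: the thresholds, the per-frame 2D samples, and the function name are all assumptions made for the example. A stroke "touches down" when hand speed rises above a start threshold and "lifts off" when it falls below a stop threshold; everything in between is kept as the drawn shape.

```python
# Illustrative sketch of speed-based gesture segmentation.
# Assumes hand positions arrive as (x, y) samples at a fixed frame rate;
# the thresholds below are placeholders, not values from the game.
import math

START_SPEED = 0.5  # units/frame above which a stroke begins ("touch down")
STOP_SPEED = 0.1   # units/frame below which the stroke ends ("lift off")

def segment_stroke(samples):
    """Return the points between 'touch down' (hand speeds up) and
    'lift off' (hand slows down); those points form the drawn shape."""
    stroke = []
    drawing = False
    prev = None
    for pt in samples:
        if prev is not None:
            speed = math.dist(prev, pt)  # distance per frame
            if not drawing and speed >= START_SPEED:
                drawing = True           # hands sped up: stroke begins
            elif drawing and speed <= STOP_SPEED:
                break                    # hands slowed down: stroke ends
            if drawing:
                stroke.append(pt)
        prev = pt
    return stroke
```

A recognizer would then compare the returned point list against each spell's template shape; the design's key property is that the player never has to signal start or end explicitly.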
Each environment was designed around the story of one of the characters.