
Anivision

As part of the Digital Applied Learning and Innovation Lab at Dartmouth, I work as a senior developer alongside other developers, designers, and a project manager on Anivision: an NSF-funded VR project exploring how different animals see and sense the world.

This March, we published Anivision on App Lab for Meta Quest 2 and 3. You can check it out here!

Focus on Vision and Movement

Anivision introduces an animal's traits over time as the player progresses through a minigame-like structure. As their vision changes, players gain a tangible understanding of how an animal's traits help it perceive and navigate its environment.

Below is a demonstration of how we grant the first trait to the player to introduce the idea, along with an example of climbing.

Minigame Structure

To allow for quick prototyping and to provide a simple framework for adding new animals, Anivision separates each animal's experience into a short minigame. Below is the trailer for the game, along with a few screenshots showing the different visuals that represent animal traits.

In my time on the project, I've been largely responsible for implementing movement systems, creating new representations of animal senses, and significantly improving performance.

I led the project from early development through its eventual release, guiding it to its current state.
