Umi's World

XR game development

Umi is an XR game I'm actively developing at my house, TANK & BEAR. It's a long-term project I've undertaken to explore what's possible at the intersection of AR and gaming.

Pitch: We're building a game where players design small worlds from the resources they collect with their virtual pet. Together, players build out a large, location-based game world.

Story: Umi is a game about small creatures whose world was destroyed by an evil force called the Rifts. They've found their way to Earth and pair up with us to rebuild their home. The player designs a portion of Umi's home world from the resources they craft.

Funding: Umi has been completely bootstrapped by yours truly.

Team

Development: Bhanu Birani
Story: Myself
Art Direction: Myself
Character Design and Animation: Ashley Farlow
VFX: Harry Alisavakis
Game Design: Myself & Sandra Honigman

Outcomes

When the whole world becomes your playground.

The core gameplay is simple: players collect resources at real-world sites, then use what they've collected to rebuild a portion of Umi's home world.

Native AR is finally here!

Since the day we started working on Umi, we've known that its native platform is not a phone but a headset. We've been following the rumor mill since 2018, hoping each year that Apple would finally release an AR product… and now they have. Below you can see concepts for how our Vision Pro app will work.

Explore & Collect

Gather objects when you're out walking.

Craft what you collect

Use those resources to craft new items.

Start in AR

The world starts simple.

Create a beautiful world

The more you collect, craft, and design, the more beautiful your world becomes.

Umi for Mobile

Though we're excited about headsets, we know that to win in a competitive market, Umi needs to be on every device. Since we've been building the game in mobile AR, a phone app is the most natural next incarnation of the product.

Map: Explore & Collect

Gather objects when you're out walking and earn rewards for steps.

Prototyping Phase 1 – planes

Learning through prototyping

When we first started working with augmented reality, the technology was limited to placing digital objects on planes that the camera could detect. Our goal was to bring the character to life. We began with the quality that signifies life to all humans: character movement.

The first character motion prototypes were crude: we started with a simple tap-and-move control scheme that worked pretty well. The medium was so new that any little success was satisfying. We moved from ground planes to enabling flight and trying to land the character on tabletops and other surfaces.
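At its core, tap-and-move is just a raycast from the tap against the detected planes, with the character walking to the hit point. Here's a simplified sketch in Unity C# with AR Foundation; the component and field names are illustrative, not our production code:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Illustrative sketch: tap a detected plane and walk the character to that point.
public class TapToMove : MonoBehaviour
{
    [SerializeField] ARRaycastManager raycastManager; // from the AR session scene
    [SerializeField] Transform character;             // hypothetical character root
    [SerializeField] float moveSpeed = 0.5f;          // metres per second

    static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();
    Vector3? target;

    void Update()
    {
        // On a new touch, raycast against detected planes to pick a destination.
        if (Input.touchCount > 0 && Input.GetTouch(0).phase == TouchPhase.Began &&
            raycastManager.Raycast(Input.GetTouch(0).position, hits, TrackableType.PlaneWithinPolygon))
        {
            target = hits[0].pose.position; // closest plane hit
        }

        if (!target.HasValue)
            return;

        // Face the destination, then walk toward it.
        Vector3 to = target.Value - character.position;
        if (to.magnitude < 0.01f)
            return; // arrived

        character.rotation = Quaternion.Slerp(
            character.rotation, Quaternion.LookRotation(to.normalized), 5f * Time.deltaTime);
        character.position = Vector3.MoveTowards(
            character.position, target.Value, moveSpeed * Time.deltaTime);
    }
}
```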

Phase 2 – advanced locomotion

Walking Umi in the real world.

Prototypes can do more than just answer questions about usability or feasibility – they can define the vision for your product.

Everything changed during a test session when the character got stuck to a plane near my feet. It was a glitch, but it suggested that we could take the character for a walk in AR.

In our first version of this, we controlled the character by moving the phone: an algorithm keeps the character running towards the center of the screen by finding the next available plane.
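In sketch form, the idea is a continuous raycast through the screen center against the detected planes, with the character chasing whatever point it finds. Again, this is a simplified Unity C# illustration with AR Foundation, not our production code:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Illustrative sketch: the character chases the point where the centre of the
// screen meets the nearest detected plane, so aiming the phone is the only control.
public class FollowScreenCenter : MonoBehaviour
{
    [SerializeField] ARRaycastManager raycastManager;
    [SerializeField] Transform character;
    [SerializeField] float moveSpeed = 0.6f; // metres per second

    static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

    void Update()
    {
        // Cast a ray through the centre of the screen onto detected planes.
        Vector2 screenCenter = new Vector2(Screen.width * 0.5f, Screen.height * 0.5f);
        if (!raycastManager.Raycast(screenCenter, hits, TrackableType.PlaneWithinPolygon))
            return; // no plane available this frame

        // Run toward the closest hit. Smoothing the turn helps hide how much the
        // hit point jumps around as the phone moves.
        Vector3 target = hits[0].pose.position;
        Vector3 dir = target - character.position;
        dir.y = 0f;
        if (dir.sqrMagnitude > 0.0001f)
            character.rotation = Quaternion.Slerp(
                character.rotation, Quaternion.LookRotation(dir.normalized), 4f * Time.deltaTime);

        character.position = Vector3.MoveTowards(character.position, target, moveSpeed * Time.deltaTime);
    }
}
```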

The advantage of this approach is that you don't need any on-screen controls to move the character. It's very elegant.

The problem is that the player needs to stand up in order to move the character, and the resulting motion can be very jerky.

Phase 3 – advanced control

Walking Umi in the real world.

This prototype is one of my favorites. It illustrates our vision for Umi and augmented reality: the whole world becomes an endless game surface; less needy Tamagotchi, more Adventure Time's Jake & Finn. Umi is a reverse Katamari Damacy, where a whole world springs from a tiny creature.

To improve character locomotion, we implemented a joystick-based control scheme and liked how much more control it gave us over the character.

This demo shows what it could feel like to walk around in the world controlling your character like a pet on a leash.

Though you can make the player move, should you?

There are many times when a player simply isn't going to get up and move around their environment. We found it more engaging to steer the character exactly where we wanted it with the joystick; it's almost like driving a remote-control car.
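Stripped down, the joystick scheme amounts to moving the character relative to the camera's heading, like steering an RC car on the detected ground. A simplified Unity C# sketch, with the joystick UI itself omitted and the names illustrative:

```csharp
using UnityEngine;

// Illustrative sketch: drive the character with an on-screen joystick, moving it
// relative to the AR camera so "up" on the stick always means "away from the player".
public class JoystickLocomotion : MonoBehaviour
{
    [SerializeField] Transform character;    // hypothetical character root
    [SerializeField] Transform arCamera;     // the AR camera transform
    [SerializeField] float moveSpeed = 0.8f; // metres per second
    [SerializeField] float turnSpeed = 8f;

    // Fed each frame by whatever joystick UI is in use (values in [-1, 1]).
    public Vector2 JoystickInput { get; set; }

    void Update()
    {
        if (JoystickInput.sqrMagnitude < 0.01f)
            return; // inside the dead zone: stand still

        // Project the camera's facing onto the ground plane to get a stable heading.
        Vector3 forward = Vector3.ProjectOnPlane(arCamera.forward, Vector3.up).normalized;
        Vector3 right = Vector3.Cross(Vector3.up, forward);
        Vector3 move = (forward * JoystickInput.y + right * JoystickInput.x) * moveSpeed;

        character.position += move * Time.deltaTime;
        character.rotation = Quaternion.Slerp(
            character.rotation, Quaternion.LookRotation(move.normalized), turnSpeed * Time.deltaTime);
    }
}
```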

Phase 4 – interactions

Creating behaviors

What attracted me to AR is how characters can come to life by interacting with us in our surroundings. One of our main reasons for switching to Unity was to gain more control over animation behaviors, so that we could make the character more engaging.

We wanted to create simple interactions, like feeding the character and having it show some awareness of you. There's nothing like the feeling of another creature holding your gaze.
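The gaze behavior is simple at its core: if the player's camera is close, turn the character's head toward it; otherwise leave the idle animation alone. A minimal Unity C# sketch, with the head bone and ranges illustrative:

```csharp
using UnityEngine;

// Illustrative sketch: when the player's camera is close, the character's head
// turns to hold the player's gaze; otherwise the animator keeps its idle pose.
public class HoldGaze : MonoBehaviour
{
    [SerializeField] Transform head;        // hypothetical head bone of the character rig
    [SerializeField] Transform arCamera;    // the player's AR camera
    [SerializeField] float gazeRange = 2f;  // metres
    [SerializeField] float turnSpeed = 4f;

    // LateUpdate runs after the Animator, so this blend overrides the idle clip.
    void LateUpdate()
    {
        Vector3 toPlayer = arCamera.position - head.position;
        if (toPlayer.magnitude > gazeRange)
            return; // too far away: leave the animated pose alone

        Quaternion look = Quaternion.LookRotation(toPlayer.normalized);
        head.rotation = Quaternion.Slerp(head.rotation, look, turnSpeed * Time.deltaTime);
    }
}
```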

Creating a battle sequence

Ashley, one of our team members, suggested that bringing the Rifts into the game as an opposing force would make it more interesting.

We took a different approach to designing this one: we had our technical artist mock up the sequence in the Unity Timeline before building it out.
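Once a sequence like that is authored, triggering it in-game is the easy part: play the PlayableDirector when the character gets close to a Rift and hand control back when it finishes. A simplified sketch, with the names and ranges illustrative:

```csharp
using UnityEngine;
using UnityEngine.Playables;

// Illustrative sketch: trigger an authored Unity Timeline when the character
// gets within range of a Rift, then hand control back when it finishes.
public class RiftBattleTrigger : MonoBehaviour
{
    [SerializeField] PlayableDirector battleTimeline; // the authored battle sequence
    [SerializeField] Transform character;
    [SerializeField] Transform rift;                  // hypothetical Rift instance
    [SerializeField] float triggerRange = 1.5f;       // metres

    bool started;

    void Update()
    {
        if (!started && Vector3.Distance(character.position, rift.position) < triggerRange)
        {
            started = true;
            battleTimeline.stopped += OnBattleFinished;
            battleTimeline.Play(); // play the authored sequence
        }
    }

    void OnBattleFinished(PlayableDirector director)
    {
        director.stopped -= OnBattleFinished;
        // Return control to the player, award resources, etc.
        Debug.Log("Battle sequence finished");
    }
}
```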

Battling in the real world.

After we built this mechanic, we found it quite satisfying. But our vision for the game is more about creating than destroying, so we're still evaluating whether it fits.

Phase 5 – location

Maps, trial and error.

We tried out several different map platforms (Apple, Mapbox, Google) before finally settling on Niantic Lightship Maps.

We learned so much about location-based gameplay that it merits a whole article of its own. You can read the Medium article I wrote for a more in-depth discussion. This video demonstrates the culmination of a lot of the work we've done: finding resources on the map and starting a crafting flow.
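At its simplest, the map side of that flow boils down to a distance check: is the player's GPS position close enough to a resource node to collect it and open crafting? Here's a simplified sketch of that check using standard haversine math (this is not the Lightship Maps API, and the pickup radius is illustrative):

```csharp
using UnityEngine;

// Illustrative sketch (not the Lightship Maps API): decide whether a map resource
// node is close enough to the player's GPS position to collect, using the
// haversine great-circle distance between two lat/lon coordinates.
public class ResourcePickup : MonoBehaviour
{
    const double EarthRadiusMetres = 6371000.0;
    const double Deg2Rad = System.Math.PI / 180.0;

    [SerializeField] float pickupRadiusMetres = 40f; // illustrative collection radius

    // Returns true if the node is within pickup range, i.e. crafting can start.
    public bool CanCollect(double playerLat, double playerLon, double nodeLat, double nodeLon)
    {
        return HaversineMetres(playerLat, playerLon, nodeLat, nodeLon) <= pickupRadiusMetres;
    }

    static double HaversineMetres(double lat1, double lon1, double lat2, double lon2)
    {
        double dLat = Deg2Rad * (lat2 - lat1);
        double dLon = Deg2Rad * (lon2 - lon1);
        double a = System.Math.Sin(dLat / 2) * System.Math.Sin(dLat / 2) +
                   System.Math.Cos(Deg2Rad * lat1) * System.Math.Cos(Deg2Rad * lat2) *
                   System.Math.Sin(dLon / 2) * System.Math.Sin(dLon / 2);
        double c = 2 * System.Math.Atan2(System.Math.Sqrt(a), System.Math.Sqrt(1 - a));
        return EarthRadiusMetres * c;
    }
}
```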

Our next set of work will be to develop the world-building design tool that the player will use. Check back soon to see our progress.