Overview
Team
❖ One Developer
❖ Three Artists
❖ One lead UX Designer / Researcher (me)
My key contributions
❖ UX Research
❖ Project Management
❖ 3D Modeling
❖ World Optimization
❖ Testing
Process
❖ Define problem
❖ Test
❖ Prototype & Implement
❖ Test
Tools used
❖ Oculus Quest 2
❖ Horizon Worlds
❖ Discord
The Problem
Users (players of the Horizon Worlds VR app) don't return to worlds they've visited. Many users, particularly new users, don't finish experiences before leaving to do something else.
This suggests problems with replay-ability and player engagement in Horizon Worlds experiences. Initial user interviews held within the app indicate these issues are rooted in two key usability gaps: clarity of player mission(s) and creating moments of joy, or fun, throughout an experience in Horizon Worlds.
How might we:
❖ Clearly indicate world task(s)
❖ Build with replay-ability in mind
❖ Create moments of joy throughout the experience
Insights and Proposed Solutions
Qualitative research indicates the factors below have the largest effect on user experience in VR dungeon-crawler experiences:
❖ Enemy type / enemy attacks
❖ World layout
❖ Weapon choice(s)
To address the team's problem of ensuring mission clarity and replay-ability, I provided the following recommendations, grounded in the initial user research.
❖ Aim for enemy and level variation for each world area. This recommendation will:
- Create excitement
- Reward users for reaching new areas
- Hint to players that world events / world task(s) are changing
❖ Allow players to choose from at least 3 weapons. This recommendation will:
- Greatly add to replay-ability
- Cater to differing user fighting style(s)
- Support accessibility in VR
❖ Utilize a player score/level system to track best plays and user progress (see the sketch after this list). This recommendation will:
- Create an incentive for player competition (a strong motivator for dungeon-crawler games)
- Add to world replay-ability
- Add a layer of strategy around which weapon(s) perform best
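To make the score/level recommendation concrete, here is a minimal sketch of what a best-play/score tracker could look like. It's written as generic TypeScript purely for illustration; the names, data shape, and logic are my own assumptions and are not taken from the Horizon Worlds scripting API.

```typescript
// Hypothetical sketch of a score/best-play tracker; names and structure are
// illustrative only, not the Horizon Worlds scripting API.

interface RunResult {
  playerId: string;   // who played the run
  weapon: string;     // weapon chosen for the run
  score: number;      // points earned during the run
  completed: boolean; // whether the player finished all three areas
}

class ScoreBoard {
  private bestRunByPlayer = new Map<string, RunResult>();
  private bestRunByWeapon = new Map<string, RunResult>();

  // Record a finished run and update the per-player and per-weapon bests.
  recordRun(run: RunResult): void {
    const playerBest = this.bestRunByPlayer.get(run.playerId);
    if (!playerBest || run.score > playerBest.score) {
      this.bestRunByPlayer.set(run.playerId, run);
    }
    const weaponBest = this.bestRunByWeapon.get(run.weapon);
    if (!weaponBest || run.score > weaponBest.score) {
      this.bestRunByWeapon.set(run.weapon, run);
    }
  }

  // Top scores overall, e.g. for an in-world leaderboard (competition / replay-ability).
  topRuns(limit: number): RunResult[] {
    return [...this.bestRunByPlayer.values()]
      .sort((a, b) => b.score - a.score)
      .slice(0, limit);
  }

  // Best score achieved with a given weapon, supporting the
  // "which weapon performs best" layer of strategy.
  bestForWeapon(weapon: string): RunResult | undefined {
    return this.bestRunByWeapon.get(weapon);
  }
}

// Example usage:
const board = new ScoreBoard();
board.recordRun({ playerId: "p1", weapon: "sword", score: 1200, completed: true });
board.recordRun({ playerId: "p2", weapon: "bow", score: 950, completed: false });
console.log(board.topRuns(3), board.bestForWeapon("sword"));
```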
Results
Playtests held at different points in the project provided useful, and distinct, feedback that informed world layout, enemy creation, and design decisions. Below are some of the results incorporated into design decisions at the beginning, middle, and end of the project:
Beginning of the project:
❖ Create more enemy types (currently there are 3) with varied attacks
❖ Change design of the weapons and update the level design away from template designs
❖ Create separate atmospheres for each area
❖ Add respawn points for users that fall off platforms
Middle of the project:
❖ Add atmosphere elements to all areas
❖ Create a hint for solving the puzzle in area 2 and add sound cue(s) upon completion
❖ Spawn more enemies throughout
❖ Create more enemy types
❖ Add more scenery to area 1 and the end of area 2
End of the project:
❖ Shorten walking in area 1
❖ Raise difficulty in area 3
❖ Use lighting to create carnival/world's fair atmosphere
**Feel free to skip ahead to my learnings from this project, or read on for details on my process and how I arrived at the solutions and results above**
Working with the team
Initially, I held a meeting with the team to discuss their vision for the world and the expected needs of the project. I also facilitated the creation of a map (very similar to wireframing in traditional UI Design) to break the project down into 'areas', each encapsulating its own mechanics, tasks, and atmosphere. Each 'area' held a list of level design ideas and goals. The map served two goals:
(1) To keep myself and the team aligned on the function and events tied to each area
(2) To help the team use a design-behavior perspective as we continually updated the map with new or changing design decisions.
Process
I like to tailor my process to the medium and the team's needs. This project called for an iterative, mostly qualitative, approach:
(1) Analyzing a need/pain point through qualitative (and some quantitative) user research
(2) Prototyping
(3) Testing to evaluate progress
The client wanted to create a dungeon-crawler experience, so I decided to take a qualitative approach in my UX work, hosting live group playtests (aka focus groups) of 2-4 users at a time. Although this approach took more time and coordination, I believe it best suited the needs of the project: the product is highly dependent on understanding the ebb and flow of VR fighting mechanics and world traversal, two topics best understood through live observation.
Playtesting
I chose a qualitative research method in order to capture direct observation and real-time reactions to world tasks and enemy interactions. I created a couple of flexible user interview scripts based on:
❖ Who was playtesting (the team vs users)
❖ The development state of the world at the time
❖ My UX findings so far (digging deeper into previous findings)
Team Playtesting
When playtesting with the Made in Brooklyn Games team, I suggested playing the experience at least twice, allowing me as a UX Researcher to test for replay-ability factors and helping the team catch issues or bugs that arose during the building process. I'd also facilitate a short discussion based on a dev playthrough interview script. I intentionally listened to feedback with a constructive frame of mind and asked follow-up questions (typically based on where we were in the development process). For example:
(1) What jumps out at you from this play?
❖ I find utilizing this open-ended question surfaces moments of strong impressions, both positive and negative.
(2) What point was the most fun in this area?
❖ This question was key to understanding which player tasks/world events consistently create moments of fun, and which don't.
(3) Does anything function, or feel, differently than you expected?
❖ This open-ended question is useful to find pain points from both a player and developer point-of-view. A lot of great constructive conversations arose from this inquiry alone.
User Playtesting
When playtesting with users, I invited a mixture of users I'd met in Horizon Worlds and friends of the developer, and also utilized optional play sessions hosted by Meta. I created a short user script with key questions for each area of the world. I also created a set of guidelines for myself to make the most of the playtests. My personal UX guidelines were:
❖ Allow other players to choose their weapon first - inquire about their thoughts on how the weapons look and feel
❖ Always wait for players to enter each area before I enter
❖ Do not help players when they're stuck or confused in any area - observe and inquire about the confusion/issue
❖ As we move from one area to another (the experience consists of 3 main areas), listen for feedback - inquire about first impressions
❖ Allow players time to fully share thoughts, ask for any additional thoughts before moving on
During the final playtests, I held all questions until the end of the experience to allow players full immersion, and took quick notes on anything players said or did during the experience. A couple of questions per area are below:
Area 1
(1) What jumps out at you from this area?
❖ I find utilizing this open-ended question surfaces moments of strong impressions, both positive and negative
(2) What is your expectation for this world based on what you see so far?
❖ This question digs into clarity of task/mission early on in the experience
Area 2
(1) Did you have any strong reactions to anything so far?
❖ This question also surfaces moments of strong impressions, both positive and negative, reworded to keep the conversation engaging.
(2) How was moving from the previous area to this one?
❖ This question investigates clarity of task/mission, as the second area swaps from a fighting task to a puzzle task.
Area 3
(1) Thoughts on this experience so far?
❖ This question invites open feedback, allowing players to share thoughts on clarity of task/mission, comparisons to other areas, or mechanics within the experience.
❖ I found that at this stage users had been playing for at least 15 minutes, ample time to notice issues, gauge replay-ability, notice any mechanics causing physical/mental fatigue, and point out anything that might be missing or was expected from the experience.
(2) How does the fighting mechanic feel?
❖ I waited until area 3 to ask this question, as it's the area that uses the fighting mechanic the most, and users have had the chance to fight enemies of different levels by this point.
Return to start
(1) What are your thoughts on this experience?
❖ An open-ended question inviting users to share their strongest impressions.
(2) Did the world match what you were expecting?
❖ This question is geared towards player expectations and clarity of task/mission throughout the experience.
(3) Anything you'd change or add?
❖ This question is designed to record moments of fun currently missing, or mechanics/actions that create friction.
(4) How likely are you to want to replay this experience?
❖ This is strictly a replay-ability question.
Blockers
❖ The biggest blocker on this project was the world capacity limit set on all Horizon Worlds experiences; we pushed up against it because of the separate atmospheres in each area, the multiple enemies, and the addition of another artist later in the development process.
- To resolve this blocker, I also functioned as an optimization developer: investigating the capacity cost of world objects, rebuilding structures with objects that use a smaller share of the capacity budget or different object types, and optimizing enemies.
❖ Scripting bugs, particularly affecting enemy spawn points.
- To resolve this blocker, the team was great about communicating with me when bugs were present. During these times I wouldn't make changes to any enemies and, if applicable, would pause playtesting sessions until the bug was resolved, shifting my focus to other tasks in the meantime.
Learnings
❖ The power of group testing sessions and observational methodologies is immeasurable. It takes more time and effort to put together multiple user playtesting sessions, but they were 100% worth it, and they were really fun for me as a UX Researcher.
❖ I'm naturally excited by high levels of experimentation, especially during the early stages of a project. By the end of this project, those benefits were reinforced, particularly the high level of experimentation in enemy creation. For example, through playtesting I found the most common lasting impression was the 'cool' / 'fun' / 'crazy' enemy types throughout the experience. I believe that a high sense of creativity in the enemies also inspired others to become more creative in world-building on this project.
❖ Through this project I gained deep experience in 3D modeling in VR. I had a lot of fun and I'm proud of the level-building and enemy-building skills I gained; I'll definitely take them with me to future projects.
❖ I had the pleasure of signing up for a 1:1 session with a lighting expert in Horizon Worlds, which helped me grow my skills in building atmosphere with lighting in a 3D space.
Want to see more of my work?
If you like what you see and want to work together, get in touch!
alicia.marisal@gmail.com