
VR Design Document

Update Four (Final)

1. Title of project

Dolphin Bay: A Disturbing Beach Day

​

2. Razor

A different kind of “beach day”: patrol the beach, collect garbage, and help wildlife as a marine conservationist.

​

3. Slogan

Be the solution to beach pollution.

​

4. Vision statement & top level summary of your project idea

The beach is full of sunshine, but sometimes it is full of garbage as well. Our participant will become a marine conservationist and experience what it’s like to handle the garbage on the Dolphin Bay beach and save marine wildlife from the physical harm of pollution. During the experience, the participant will communicate with their co-worker and receive instructions via their mobile phone. We would like our participants to engage in an entirely different experience and help them develop empathy for marine wildlife.

​

5. Goal of project/Differentiator

The goal of the project is to help our users realize the seriousness and terrible consequences of pollution. The unique experience within our project can help our users develop or increase empathy towards wildlife and understand that they have the ability to help reduce pollution. We hope this experience ultimately transforms our users to become more pro-environment in real life.

​

6. Themes of project

The period of our project is contemporary because the problem of pollution and the harm it brings upon marine wildlife are happening in the present day.

Our themes are education, empathy, and transformation because we want to use this experience to educate our participants about the consequences of pollution, help them develop empathy towards marine wildlife and disapproval towards pollution, and ultimately transform them to be more pro-environment in the real world.

​

7. Visual style of project

The visual style of our project is semi-realistic. We want our environment to be visually stunning but at the same time realistic enough for the participant to relate it to the real world. Part of the inspiration came from an immersive game called ABZÛ, which has been widely praised for its artistic style and beautiful environmental design. We will use soft lighting and warm-toned textures to make the whole environment pleasing and relaxing so that our participants will be pleased to stay and explore. The litter and the sight of the dolphins will also create a contrast with the common beach themes of peace, relaxation, and pleasantness.

​

8. Core desired user experience

a) Desired user experience and how your VR experience ideally transforms immersants 

Our users should feel annoyed by the amount of garbage on the beach and by how difficult it is to clean up. They should also feel sad upon discovering the dead dolphins and an urge to help the dolphin on the beach that is choking on a plastic bag. We expect that from this experience, users will develop empathy for ocean life and disapproval of pollution, and ultimately become more pro-environment in the real world.

​

b) How is your project taking advantage of the special affordance and opportunity of VR? 

Many people go to the beach, but our project shows them the other side of it. Our participants will explore a beach scattered with human-produced garbage from the perspective of a marine conservationist; they will have the opportunity to meet marine wildlife harmed by pollution and even save a dolphin by interacting with it. This is an experience they rarely have in real life.

​

c) Relation to course design challenge

This project is designed to bring pollution into the spotlight and help users see it from a different perspective. We want people to realize how easy it is to neglect the damage humans do daily and that simple actions such as picking up garbage can make a big difference. Through the sight of dolphins killed by human pollution and the act of rescuing the living dolphin on the beach, users can develop empathy towards wildlife and loathing towards pollution, thus transforming them to become more pro-environment in real life.

​

9. Introduction

Our participant will spend their day taking over a co-worker’s shift as a marine conservationist at Dolphin Bay, an experience that many people have never had or never thought about having before. They will get the chance to explore a beach that should be beautiful but is heavily polluted, witness dolphins that have died because of the pollution, and rescue a dolphin that is struggling to breathe while choking on a plastic bag.

​

10. Narrative/Story 

In the early morning, MC, a marine conservationist, receives a text message from a senior co-worker called Dave asking him to take over a shift at a beach called Dolphin Bay. MC accepts the assignment and arrives at Dolphin Bay shortly after. Although a beautiful beach, Dolphin Bay is heavily polluted. Human litter can be seen everywhere, and the pollution has also caused the deaths of several dolphins that washed up on shore. While cleaning up along the beach, MC sees a live dolphin stranded on the sand. The poor creature cries helplessly while choking on a white plastic bag. MC asks Dave for instructions, gently pulls the plastic bag out of the dolphin’s mouth, and pushes the dolphin back into the ocean. The dolphin swims away and jumps out of the water happily, as if to say thank you. After rescuing the dolphin, MC has the option to leave the beach right away or to stay and continue cleaning up.

​

11. Storyboard


12. VR mechanics & Physical Rig

  • Core mechanics

Our participants will use the keyboard and mouse or a controller to move and control the actions of the avatar in the virtual environment. Participants will follow a linear storyline, with the optional task of picking up garbage along the beach. During or after the experience, they should develop a feeling of loathing towards littering, develop empathy for the dolphin, and change their perspective on the impact of human pollution on marine wildlife.
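For reference, here is a minimal sketch of this keyboard-and-mouse control scheme in Unity C# (the build itself relies on an FPS controller, as noted in the Technical Documentation section, so the class and field names below are purely illustrative, and gamepad input is not covered):

using UnityEngine;

// Illustrative sketch of keyboard + mouse avatar control (not the exact project
// script): WASD moves a CharacterController and the mouse rotates the view.
[RequireComponent(typeof(CharacterController))]
public class SimpleAvatarController : MonoBehaviour
{
    public Transform cameraTransform;   // assumed child camera of the avatar
    public float moveSpeed = 3f;
    public float lookSensitivity = 2f;

    private CharacterController controller;
    private float pitch;

    void Start()
    {
        controller = GetComponent<CharacterController>();
        Cursor.lockState = CursorLockMode.Locked;   // hide the OS cursor so only the reticle shows
    }

    void Update()
    {
        // Mouse look: yaw the body, pitch the camera.
        float yaw = Input.GetAxis("Mouse X") * lookSensitivity;
        pitch = Mathf.Clamp(pitch - Input.GetAxis("Mouse Y") * lookSensitivity, -80f, 80f);
        transform.Rotate(0f, yaw, 0f);
        cameraTransform.localEulerAngles = new Vector3(pitch, 0f, 0f);

        // WASD movement relative to where the avatar is facing; SimpleMove applies gravity.
        Vector3 move = transform.forward * Input.GetAxis("Vertical")
                     + transform.right * Input.GetAxis("Horizontal");
        controller.SimpleMove(move * moveSpeed);
    }
}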

​

  • Secondary mechanics

There will only be one user in our VR experience at a time. Our target audience is generally anyone familiar with using computers, mice, and keyboards. The physical setup would include a small room made out of some cloth wrapped around metal stands. There would be a projector projecting an illustration of dolphins in the ocean, and a few light bulbs at the top to simulate the brightness and heat on a beach. The user would be instructed to use the keyboard and mouse or a controller to participate in the experience.

​

  • Envisioned physical setup


Our physical setup will be a booth built with a rig from the SFU Surrey campus and a dark cloth that covers three sides of the rig. Three light bulbs and a small projector will be installed at the top of the booth to project a silhouette of a dolphin. The bottom will have a custom platform built with plywood, wire nails, and hot glue. Real sand and a few pieces of garbage will be placed on the platform for the user to stand on for a more immersive experience. Garbage will also be placed outside of the booth to test the user’s response after they finish the experience.

​

  • Locomotion technique

a) Keyboard & Mouse: We decided to go with this option because it is more accessible for our team and easier to implement. More people are familiar with a keyboard and mouse than with controllers, so it is easier for us to target a broader audience. A keyboard and mouse are indeed less immersive, because the keyboard’s binary inputs are less natural and limit the possible input, and less comfortable, because the user has less freedom to move around; however, because of the technical limitations that prevent us from using HMDs, we decided to go with this locomotion technique.

b) Reticle in the Centre of the Screen: Since the participants use a mouse to look around the virtual environment, we need a way for them to select objects in the scene. We chose to draw a reticle in the centre of the screen and use Unity’s Raycast feature to detect what it points at. Even though this is more complex to implement than using a visible mouse cursor to select elements, we went with this option because we want to leverage our participants’ familiarity with traditional video games such as Minecraft for a more immersive feeling; a visible cursor moving around the screen would break the immersiveness of our experience.
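For illustration only (not the exact project script), a centre-screen reticle selection of this kind can be sketched in Unity C# as follows. The “selectable” tag and the raycast distance limit echo choices listed later in the Technical Documentation section; the class and field names are assumptions:

using UnityEngine;

// Hedged sketch of centre-screen reticle selection: a ray is cast through the
// middle of the viewport, and only objects tagged "selectable" within a limited
// distance count as hits.
public class ReticleSelector : MonoBehaviour
{
    public Camera viewCamera;              // the player's camera
    public float maxSelectDistance = 3f;   // illustrative limit so distant garbage can't be grabbed

    void Update()
    {
        // Ray through the exact centre of the screen, where the reticle is drawn.
        Ray ray = viewCamera.ViewportPointToRay(new Vector3(0.5f, 0.5f, 0f));

        if (Physics.Raycast(ray, out RaycastHit hit, maxSelectDistance)
            && hit.collider.CompareTag("selectable"))
        {
            // hit.transform is the object under the reticle; highlighting and
            // pick-up behaviour (sketched in the Technical Documentation section)
            // would be triggered from here.
            if (Input.GetMouseButtonDown(0))
            {
                Debug.Log("Selected " + hit.transform.name);
            }
        }
    }
}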

​

  • Pre- and post-VR-experience

The user will be greeted by our team outside of the experience booth, where we will explain that the experience is from the point of view of a marine conservationist, that the setting is a beach, and that the experience is about pollution. When the user enters the booth and sees the setup, installation elements such as the dolphin projection and the sand will give them an idea of what the virtual environment will look and feel like.

After the experience, the user may feel sad about the harm pollution brings upon animals and feel a stronger sense of responsibility towards reducing pollution. We will intentionally place litter such as a soda bottle or a plastic bag on the floor to test the user’s response.

​

  • Other stimulation

The user will feel sand beneath their feet during the experience. The lightbulbs will provide enough heat to simulate the heat from the sun on a beach.

​

  • What the HMD-based version would look like

Our team was not able to develop this experience with HMDs due to technical limitations; however, if we were to use HMDs, we would retain most of the features of the virtual environment and change only a few of the interactive components. Firstly, we would use the controllers that come with the HMD for movement and interaction in our VE instead of a keyboard. The controllers would simulate hands in the virtual environment, and the garbage would be picked up by pressing a button on the controller for a more authentic and immersive interaction. Secondly, we would incorporate both walking and teleportation so that our participants could choose whichever they prefer or whichever causes less motion sickness. The reticle in the centre of the screen would also be removed. This way, the interactions and the overall experience would feel more natural than navigating and interacting with a keyboard and mouse.

​

13. Inspiration Analysis

We came up with the idea for this project after a discussion about what kinds of experiences are not overused, do not involve violence, would be transformative, and could be created in Unity. Part of the inspiration came from an immersive game called ABZÛ. In this game, the player plays as a diver exploring ancient technological sites and submerged ruins surrounded by underwater environments teeming with flora and fauna. The game focuses on exploring aquatic animals that peacefully interact with humans, and it has been widely praised for its artistic style and beautiful environmental design. All our team members are concerned about environmental protection, so we hope to provide people with a meaningful and transformative experience through aesthetic design and interactive experiences like ABZÛ.

Additionally, our storyline was inspired by the game Firewatch. The protagonist in that game takes on the job of a national park ranger to temporarily escape from the reality of his dying wife. The game not only has a fascinating and thought-provoking storyline but also lets players get to know the duties of a national park ranger. Similarly, in our experience, participants get to experience what it’s like to be a marine conservationist and handle the situations that come up on the beach. To narrow down our idea, we engaged in an ideation process called brain-dumping: we generated many ideas in a short time, and after further discussion among ourselves and with Noah, we arrived at our current idea.

​

14. Immersion Frameworks

a) In what way will your project support immersion, flow and/or presence etc.? 

Our physical setup, along with the sound design, will block out physical inputs from the real world and make the user feel like they are actually exploring a beach. Objects such as the different kinds of garbage and the dolphins will attract the user’s attention and prompt them to interact with the environment without making them feel forced. We believe these will make the user feel like they naturally belong in the VE without the distraction of the physical world, thus supporting immersion and flow.

​

b) What type of immersion are you focusing most on, and why? How do you plan on using these to support your overall project objectives and desired user experience? 

Our project mainly focuses on sensory immersion and imaginative immersion. Our physical rig is a booth specially designed with props such as sand, light bulbs, and a projector to simulate our virtual beach environment, and it blocks out physical inputs from the real world to a great extent. Along with the sound design, the sensory inputs of the virtual environment will help the user achieve sensory immersion. As the user progresses, the appealing visuals, the realistic audio, and the interactions with the garbage and the dolphins will keep the user deeply involved and help them reach imaginative immersion. Accomplishing these two types of immersion helps us keep the user engaged and feeling present in our virtual environment. Challenge-based immersion is also present in tasks such as cleaning up the garbage and rescuing the dolphin; however, these activities are not designed to be so challenging that they distract users from the plot or interfere with our end goals.

​

c) Please explain in detail how your team plans on evoking your chosen immersion aspects.

We plan to manipulate the physical inputs to help the user achieve sensory immersion. Since our setting is a beach, we will install light bulbs on the physical rig to simulate the brightness and the heat of the sun. We will also use real sand and let the user stand on it during the experience to make it feel more realistic. We will also create a task that is at the right difficulty to keep the user entertained for challenge-based immersion.

​

15. Why Your Project is Innovative

(a) What’s new/interesting/cool/exciting/different about your project?

A beach environment is not an overused concept in virtual experiences, so its uniqueness will attract people’s attention and interest during the showcase. Our experience also has a well-thought-out story to contribute to the specific objective of reducing pollution and maintaining the wellbeing of marine wildlife. It has great potential to transform people to be more pro-environment on a broader spectrum in the physical world.

(b) Why is your project relevant? How does it provide a meaningful/desirable experience to the users?

Pollution happens everywhere every day, yet it is easily neglected by many people. Our project uses a setting that is much appreciated by the general public to help users see the problem of pollution and become more aware of the hazards littering brings upon the ocean’s ecosystem.

(c) For your showcase, what would be your main “selling points”? Why should anyone care about it?

The main selling point of our project is that pollution happens every day and everywhere, even on the beaches many people love and appreciate so much. People should care because pollution has dire consequences, and we have to be aware of it, care about it, and take action, even actions as simple as not littering or picking up litter, to help our environment before it is too late.

​

16. User Testing Goals and Outcomes

(a) Goals, Questions, and Hypotheses:

  • How engaging is our experience? Is the challenge appropriate for the experience that we are creating?

  • Is the background story clear to you? Anything confusing that we should clarify for our participants before the experience?

  • Is there anything we should add or remove in the virtual environment?

  • How engaging is the environment?

  • Do you explore the beach first before investigating the dolphins?

  • Does our experience make you motion sick?

  • What do you think about the avatar's movement speed?

  • How did you feel when you saw the dolphins on the beach?

  • What would be a natural way to end the experience?

  • What message did you get from the experience?


(b) Methods:

  • We presented our idea in the IAT445 lab on June 10 to the students and the teaching team. At the end of the presentation, we asked the questions we had, and the students and instructors asked a few of their own. The instructors then provided feedback on our project, such as what we needed to improve or make more specific.

  • We conducted user testing during our IAT445 labs on June 17 and June 22 with our TA and classmates by providing a build of the experience and a survey.

(c) Results:

  • The main feedback we got during the pitch was that we needed a specific storyline to make the experience more compelling and that we needed to engage the player emotionally instead of simply showing the instructions and telling them what to do. Therefore, we revamped our project to integrate a linear storyline and different interactions.

  • During our first user testing, we received positive feedback on the storyline and suggestions to add more interactive elements to the environment that were not yet in the build at the time. Only one person answered our survey; they found our experience quite engaging and good at triggering empathy.

(d) Meta-reflection:

  • During our presentation, the most important feedback we got was that we need an easy-to-understand narrative in the experience, because it can be confusing for participants who have never seen it before. We therefore decided to change our approach and created a linear storyline for a more relatable, emotional, and immersive experience.

  • From our user testing, we realized that although the additional storyline significantly increased the difficulty for us and required redoing the virtual environment, it made the experience more believable and more transformative for our participants.

​

17. Prototyping process

First, we discussed and created a list of objects that we needed for the project. Then we divided the objects among team members to model individually in Autodesk Maya. Most objects were roughly sketched and then used as references to build the 3D models (Figure 1). Because of the importance of dolphins in our virtual environment, we created our own dolphin model and rigged it in Maya (Figure 5). The animation of the dolphins was created based on research into real dolphins’ motion. The dolphin keeps struggling until the participant pulls out the plastic bag that is stuck in its throat (Figure 2).


Figure 1. The sketches for the dolphin.


Figure 2. Dolphin animation (struggling on the beach).

​

As for the actual scene, we created a large terrain in Unity and shaped it like a beach. Initially, we had labels hovering above each piece of garbage showing what kind of garbage it was (Figure 3), but after refining our idea we removed them because they broke immersion. We then modelled many different kinds of garbage and scattered them randomly around the beach using Polybrush for a more realistic feeling (Figure 6).


Figure 3. There will be a sound effect after the participant picks up the garbage on the beach.


Figure 4. The first few pieces of garbage we created in Maya and imported into Unity had texture issues.


Figure 5. Maya screenshot of our custom dolphin model rigged with animation.


Figure 6. Screenshot of the first room scene with glitched lighting.

​

Aside from the beach scene, we also added a room scene for a more realistic storyline. We wanted to keep the room clean and minimalist while still looking like a regular bedroom so that our participants could easily recognize what it is (Figure 6). The room initially had lighting issues, but we fixed them. We also added a phone on the nightstand that the participant can interact with; it highlights itself to catch the participant’s attention.


Figure 7. The two HUD elements that we didn’t end up using.

​

Because of concerns about the lack of direction in the experience, we added a few HUD elements in the top-right corner of the screen to let people know what to do. However, we removed them for the final build because they made the project feel like a game rather than an immersive experience that leaves an impact on the participants. We wanted the experience to feel natural and more like free roaming, rather than giving specific directions, in order to stay as close to real life as possible and increase the immersiveness.

​

18. Development Process

(a) Summary

During assignment three, we first came up with the beach idea after several meetings among our team members and with Noah. However, when we reached the SteamVR stage, we found out that our computers were not compatible with the Oculus Quest 2 headset, and the only team member who has a compatible device does not have a strong enough processor to run SteamVR. We spent days looking up ways to make it work on the Internet; however, every method we tried failed. We eventually had to compromise by removing the HMD component from our final project. Even though we had to change a lot of details in our experience due to the lack of an HMD, we still used scrum boards in our Discord group chat to coordinate our work.


(b) Discussion and reflection on your team process

We used a scrum board in our Discord group chat to keep track of the tasks we needed to complete for each week’s deliverables. The board helped us distribute tasks among team members and set deadlines for ourselves so that we could finish the assignments on time with as many requirements met as possible. Everyone updated the scrum board with what they had completed and the progress of incomplete tasks. The scrum board kept each team member up to date on what they needed to do; however, it did not help us estimate how long the tasks we came up with would take. It also helped us coordinate our to-do list and track progress, but it did not help us keep what we made consistent in terms of aesthetic styling. To solve that, we held frequent meetings, which improved the quality of the assignments.

​

19. Critique

(a) What feedback did you receive?

From the project pitch, we received two valuable pieces of feedback. The first was that our project needs a complete story with a good build-up to make sense of the experience. The second was that, although we cannot take advantage of an HMD, we still need to design for imaginative immersion to keep the user deeply involved and create compassion.

From the user testing sessions in the lab, we received four main pieces of feedback. First, the text messages disappeared too fast for the user to understand what was going on. Second, the interaction with the living dolphin was too minimal to channel emotions and evoke empathy. Third, we needed boundaries in our environment so that the user would not walk off the edge of our virtual world or go somewhere they are not supposed to go, such as under the ocean. Lastly, the garbage bins were not interactive, so we should remove them to avoid confusion.

(b) How specifically did you incorporate them into your project?

Based on the feedback from the project pitch, we have modified and strengthened our story, as shown in the Narrative/Story section, such that the experience and its goal would make better sense to our users. We have also altered our physical rig concept and story world to accommodate and support the story and help the environment to be more absorbing and imaginatively immersive.

Based on the feedback from the user testing, we extended the duration of the text messages, added the interaction with the plastic bag in the dolphin’s mouth, created new animations for the dolphin, and added boundaries to our virtual environment. However, we kept the garbage bins because we believe they serve an ironic purpose: the garbage could have been disposed of properly, yet some people still litter.

​

20. Equipment needs

What kind of equipment will you need for your showcase?

From ourselves:

  • Plywood

  • A computer

  • A keyboard

  • A mouse

  • A pair of headphones

  • Rubber bands

  • A small projector

  • Light bulbs

  • Wires

  • Sand

From school:

  • Tall black metal stands

  • Chairs

  • Several pieces of large black cloth

​

21. Technical Documentation

How to run the application

  • Click on the application to start the experience

  • Use WASD keys on the keyboard to move; use the mouse to change direction and click to interact with objects

​

Overview of the architecture


External plugins and assets used:

 

Scripts we created ourselves

  1. Rotating the object and playing animation on trigger

  2. Fade in/out transition when changing scenes (see the sketch after this list)

  3. Reticle Raycast and highlighting objects that it hits

  4. Phone flashing

  5. Picking up garbage animation using MoveTowards

  6. Ignoring collision between certain colliders

  7. Ending the experience/program

  8. Changing the centre of mass of the rigidbody
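As an illustration of item 2 above, here is a hedged sketch of the fade-to-black scene transition in Unity C#. The project animates a black image before changing scenes; this version produces the same effect by lerping a CanvasGroup’s alpha inside a coroutine, and all names are illustrative assumptions:

using System.Collections;
using UnityEngine;
using UnityEngine.SceneManagement;

// Sketch of a fade-to-black scene transition: a full-screen black overlay is
// faded in over fadeDuration seconds, then the next scene is loaded.
public class SceneFader : MonoBehaviour
{
    public CanvasGroup blackOverlay;   // full-screen black UI Image with a CanvasGroup
    public float fadeDuration = 1f;

    public void FadeToScene(string sceneName)
    {
        StartCoroutine(FadeAndLoad(sceneName));
    }

    private IEnumerator FadeAndLoad(string sceneName)
    {
        float t = 0f;
        while (t < fadeDuration)
        {
            t += Time.deltaTime;
            blackOverlay.alpha = Mathf.Clamp01(t / fadeDuration);   // 0 = clear, 1 = black
            yield return null;   // wait one frame
        }
        SceneManager.LoadScene(sceneName);   // e.g. from the bedroom scene to the beach scene
    }
}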

 

Problems encountered and solutions:

  • Problem: Making the reticle detect the object

       Solution: Using Unity’s Raycast and Vector3 calculation to retrieve the object

  • Problem: The animation of picking up the garbage.

       Solution: A coroutine with a while loop that calls MoveTowards each frame (see the first sketch after this list)

  • Problem: Highlighting of the hovered object

       Solution: Get the object’s Renderer component, store its default material in a variable, and apply the predefined highlight material. When the object is no longer selected, assign the default material back to the renderer (see the second sketch after this list).

  • Problem: Making the garbage disappear after it is collected

       Solution: Because the object stays in the scene, our scripts would keep trying to retrieve its renderer even after it was destroyed, so instead of destroying it we disable its mesh renderer after the garbage is picked up and the animation has run.

  • Problem: When collecting the garbage on the beach, the garbage collides with the camera.

       Solution: An invisible object was added under the FPS controller, and the pick-up animation moves the garbage toward it instead of directly toward the camera.

  • Problem: The garbage bouncing off the FPS controller

       Solution: When the mouse is clicked and the object is selected, the object’s mesh renderer gets disabled

  • Problem: The object not switching back to the default material if the user doesn’t pick it up

       Solution: Another Transform variable holding the currently selected object was added to the script; it gets the renderer component and assigns the default material back to the object if it is not picked up.

  • Problem: The user has superman powers and can pick up objects even if they are very far away

       Solution: A limit of Raycast length was added to the Raycast part of the script.

  • Problem: Fading between scenes

       Solution: Animation of a black image was added before changing the scene.

  • Problem: The Raycast selects everything in the scene

       Solution: A tag “selectable” was created and the script will check the tag before performing the other actions

  • Problem: Making the dolphin swim in a specific direction.

       Solution: Instead of rotating the dolphin at a constant speed, a sphere was added in the ocean, and a script rotates the dolphin towards the sphere so that it swims to wherever the sphere is placed (see the last sketch below).
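First sketch: the garbage pick-up behaviour described in the items above, written as a hedged Unity C# illustration rather than the exact project script. The selected piece is moved each frame toward a hidden target parented under the FPS controller using MoveTowards in a coroutine, and its renderer and collider are then disabled so it neither reappears nor bounces off the player. The class and field names, and the kinematic-rigidbody handling, are assumptions:

using System.Collections;
using UnityEngine;

// Moves a collected piece of garbage toward a hidden target, then hides it.
public class GarbagePickup : MonoBehaviour
{
    public Transform pickupTarget;   // invisible object placed under the FPS controller
    public float pickupSpeed = 4f;

    public void Collect(GameObject garbage)
    {
        StartCoroutine(MoveAndHide(garbage));
    }

    private IEnumerator MoveAndHide(GameObject garbage)
    {
        // Stop physics so the piece doesn't push the player while it travels (assumption).
        if (garbage.TryGetComponent(out Rigidbody body)) body.isKinematic = true;

        while (Vector3.Distance(garbage.transform.position, pickupTarget.position) > 0.05f)
        {
            garbage.transform.position = Vector3.MoveTowards(
                garbage.transform.position, pickupTarget.position,
                pickupSpeed * Time.deltaTime);
            yield return null;   // continue next frame
        }

        // Hide the object instead of destroying it, as described above.
        garbage.GetComponent<MeshRenderer>().enabled = false;
        garbage.GetComponent<Collider>().enabled = false;
    }
}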
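Second sketch: the hover highlighting. The object’s current material is cached, the highlight material is applied, and the cached material is restored when the reticle moves off the object or the object is not picked up. Names are again illustrative:

using UnityEngine;

// Swaps a highlight material onto whatever the reticle currently points at
// and restores the cached default material when the selection changes.
public class HoverHighlighter : MonoBehaviour
{
    public Material highlightMaterial;

    private Renderer currentRenderer;   // renderer of the object under the reticle
    private Material defaultMaterial;   // its original material, restored on exit

    public void SetHovered(Transform hovered)
    {
        Renderer rend = hovered != null ? hovered.GetComponent<Renderer>() : null;
        if (rend == currentRenderer) return;   // nothing changed this frame

        // Restore the previously highlighted object, if any.
        if (currentRenderer != null) currentRenderer.material = defaultMaterial;

        currentRenderer = rend;
        if (currentRenderer != null)
        {
            defaultMaterial = currentRenderer.material;     // cache the default
            currentRenderer.material = highlightMaterial;   // apply the highlight
        }
    }
}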
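Last sketch: steering the dolphin toward a target sphere placed in the ocean, so it swims to wherever the sphere is. Names and speeds are illustrative assumptions:

using UnityEngine;

// Turns the dolphin gradually toward a target sphere and moves it forward,
// so repositioning the sphere changes where the dolphin ends up.
public class DolphinSwim : MonoBehaviour
{
    public Transform targetSphere;   // hidden sphere placed in the ocean
    public float swimSpeed = 2f;
    public float turnSpeed = 60f;    // degrees per second

    void Update()
    {
        // Rotate toward the sphere...
        Vector3 toTarget = targetSphere.position - transform.position;
        Quaternion desired = Quaternion.LookRotation(toTarget);
        transform.rotation = Quaternion.RotateTowards(
            transform.rotation, desired, turnSpeed * Time.deltaTime);

        // ...and keep swimming in the direction the dolphin is facing.
        transform.position += transform.forward * swimSpeed * Time.deltaTime;
    }
}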

 

Link to the executable: https://drive.google.com/drive/folders/1LWDgHf0fUYEZ7Vyg_gkLpiJoUYzaOj0M?usp=sharing

​

22. Appendix A: Documentation of Ideation Process


23. Appendix B: Sketches & Misc

(Image gallery: sketches, physical rig and setup diagrams, and development screenshots.)