Immersion and presence – Why are they important?

Testing is about gaining knowledge. To understand how to test VR effectively, we need to understand VR. In my last post I referenced a paper by Daniel R. Mestre; in this post I will go into what I’ve learnt from it.

So how do immersion and presence work together in the VR experience?

Presence is defined as the sensation of being in the virtual environment

We can think of presence as a psychological quality. It is our perception of existing inside the virtual environment, and it is subjective.

Immersion is capable of producing a sensation of presence

(Ijsselsteijn & Riva, 2003)

Let’s think about this connection. Presence is the subjective feeling of being within a virtual environment, and immersion provides a vehicle for this feeling.

“The term immersion thus stands for what the technology delivers from an objective point of view”

(Mestre, 2005)

The connection should be clearer now. Presence is a subjective term: it covers how a user feels about the virtual environment, from a psychological point of view. Immersion covers what the technology can objectively deliver to give the user a strong feeling of presence within a virtual environment.

Now who is best placed to “measure” immersion levels?

Well, obviously I’m going to say testers. We’ve been doing something like this for years, but calling it user experience. Now I’m not saying testing VR is just UX testing, but it is about taking some of those principles and applying them to VR.

We cannot fully control how present a user is within virtual environments, but we can control how immersive a virtual environment can be. If we create an experience which allows complete immersion, then a user is more likely to feel present there.


References from the paper “Immersion and Presence” by Daniel R. Mestre


Presence vs. Immersion

This article was brought to my attention. It discusses the concepts of presence vs. immersion, how they relate, and how they cover different aspects of the VR experience.

I’m going to dive into this over the next few days and see how this knowledge can help improve my testing approach.

I’ll be back next week with an article covering what I learn.

Don’t worry, it’s only minor – Bug severity in Oculus Rift testing

Bug severity always raises different opinions. We’ve all submitted a bug and seen it edited down to a lower severity. Severity ratings become a loose guide to the nature of a bug. They can be useful, but a 1-5 rating does not convey enough information on its own.


There are bugs that are a lower priority to fix, but there are no minor bugs when testing in VR.

*ANY* bug can break immersion.

Our aim is to give users the most immersive and seamless experience possible.

A bug may be minor in nature, but its knock-on effects are never minor. A user may recover immersion more quickly from a less critical bug, but that does not make it minor.
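To make that concrete, here’s a minimal sketch of a bug record that never lets the severity number travel on its own. The field names and report format are my own illustration, not any team’s actual template:

```python
from dataclasses import dataclass

@dataclass
class VrBugReport:
    title: str
    severity: int          # 1 (cosmetic) to 5 (blocker) - a loose guide only
    immersion_impact: str  # what the bug does to the user's sense of presence

    def summary(self) -> str:
        # The severity number alone is ambiguous, so the summary always
        # pairs it with the immersion-impact description for triage.
        return f"[S{self.severity}] {self.title} - immersion impact: {self.immersion_impact}"

bug = VrBugReport(
    title="Menu text flickers when head turns quickly",
    severity=2,
    immersion_impact="User notices the flicker and is pulled out of the scene",
)
print(bug.summary())
```

Even a “severity 2” entry then arrives at triage carrying the reason it still matters in VR.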

Full immersion is achievable, but only if we make the experience as smooth as it needs to be.


Doesn’t look like that on my computer – The Oculus Rift version

So we’ve all been there… By “we” I mean developers and testers. The tester finds a bug and the developer says it doesn’t behave like that on their machine. So then you puzzle out what is different between the two machines; well, that’s how it should go…

Now take that situation and increase the variables by at least a factor of 10. That is what happens when you bring Oculus Rift into the equation.

So I’m retesting a bug based around the position of content. I briefly touched on depth of field issues in a previous post, but let’s get into it more. Where you position content in relation to the user is critical. If content is too close to the user, they will feel claustrophobic. If it’s too far away, they won’t be able to experience it as intended. And if it’s even a little too close, accessing content below the user’s resting eye level becomes very uncomfortable.
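As an illustration, a placement check along these lines could flag content distances before anyone puts the headset on. The 0.75 m and 3.5 m thresholds are placeholder values I’ve picked for the sketch, not official figures; they’d need tuning against real in-headset testing:

```python
def classify_content_distance(distance_m: float,
                              near_limit_m: float = 0.75,
                              far_limit_m: float = 3.5) -> str:
    """Classify how a content distance is likely to feel to the user.

    The default limits are illustrative guesses for a seated user, not
    ergonomics data; refine them by testing in the headset.
    """
    if distance_m < near_limit_m:
        return "too close: risks feeling claustrophobic"
    if distance_m > far_limit_m:
        return "too far: content may not be experienced as intended"
    return "comfortable range"

print(classify_content_distance(0.4))
print(classify_content_distance(2.0))
```

A check like this doesn’t replace wearing the headset; it just catches the obvious placement mistakes earlier.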

In today’s situation the dev came around to watch me using Oculus Rift as I retested this bug. Not only did we realise it was still a bug, but we also realised the difference in how we experienced the same content. Between the way the headset was calibrated and the positioning and angle of the motion tracker, there was a big difference between our ‘at rest’ eye levels.

Through this bug we discovered the need to implement not only a calibration process for users, but, crucially, a calibrated setup in the office. We need to be sure that developers and testers are experiencing the same thing. Seeing the same thing is not enough!

We need to know that a turn of the head will produce the same result at either the dev’s or the tester’s workstation. Now obviously it’s unlikely that separate workstations can be set up *exactly* the same, but recognising the issue is the key here!
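A workstation calibration check could be as simple as comparing the measured ‘at rest’ eye levels and flagging when they drift apart. The 2 cm tolerance here is a made-up starting point for illustration, not a measured requirement:

```python
def eye_levels_match(dev_eye_level_m: float,
                     tester_eye_level_m: float,
                     tolerance_m: float = 0.02) -> bool:
    """Return True if two calibrated 'at rest' eye levels agree.

    tolerance_m is an illustrative 2 cm default; how much drift is
    acceptable is exactly what in-headset testing should establish.
    """
    return abs(dev_eye_level_m - tester_eye_level_m) <= tolerance_m

# Example: dev workstation calibrated at 1.20 m, tester's at 1.29 m.
print(eye_levels_match(1.20, 1.29))
```

Run as part of setup, a check like this would have caught the mismatch before it showed up as a disputed bug.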

That realisation gives us the opportunity to learn more about how to give the user the highest quality VR experience they can get!

First thoughts – Testing with Oculus Rift

When I put on the headset for the first time, the immediate brightness instantly triggered my ‘design alarm’. Poor contrast and overly bright interfaces are long-standing bugbears of mine. It became apparent that they were going to be even more of an issue inside the headset.

It may seem obvious that overly bright interfaces would be worse in VR, but if it’s that obvious, why does it still happen on websites?

I noticed that line weights were dramatically reduced when viewed in Oculus Rift compared with a monitor. This issue connects to contrast: if your copy renders much thinner in a headset, it becomes very difficult to distinguish the copy from its surroundings.
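Contrast, at least, can be measured objectively. This sketch uses the standard WCAG 2.x relative-luminance and contrast-ratio formulas; note that applying flat desktop thresholds to VR is my assumption, and in-headset rendering (thinner line weights, lens effects) may effectively demand a higher ratio:

```python
def _linearize(channel: int) -> float:
    # Convert an 8-bit sRGB channel to linear light (WCAG 2.x definition).
    c = channel / 255.0
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb: tuple) -> float:
    r, g, b = (_linearize(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg: tuple, bg: tuple) -> float:
    # WCAG contrast ratio: (lighter + 0.05) / (darker + 0.05), range 1:1 to 21:1.
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

print(contrast_ratio((0, 0, 0), (255, 255, 255)))  # black on white: maximum contrast
```

On the desktop, WCAG AA asks for at least 4.5:1 for normal text; what the equivalent minimum should be inside a headset is an open question worth testing.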

I’ve been known to fuss a lot about contrast issues, but that’s because I believe they’re very important.

There are huge numbers of people with sight problems, both diagnosed and undiagnosed, and I fully support them. If you present content that requires concerted effort from the user to read, then you are alienating a big percentage of your possible audience.

Now extend this idea to VR.

If you create a product that alienates a fair percentage of your audience, they don’t just decide to use another VR system. You’re not running a website, where you might lose them to a competitor.

Alienating someone means they will most likely be lost to the world of VR. When you’re trying to present ‘the next big thing’ you need each and every person to go ‘WOW’.

If you make one person go ‘WOW’, they tell others; obviously the converse is also true. If someone has a bad experience, you don’t lose just that one person to VR, you potentially lose more.

Good testing isn’t simply about pointing out issues with a user’s experience. It’s easy to say something has bad contrast and could be hard to read. It’s harder to see the knock-on effects that issue can cause. That’s where good testing comes in: the ability to see the problem and the *potential* problems it creates.


Software testing with Oculus Rift

So recently I’ve had the opportunity to work with a company developing a media player for VR, most notably for Oculus Rift.

As a user, putting on the headset does throw you into another world. As a tester, that world is even more different. Not only are you coming to grips with what you’re seeing, but at the same time you’re questioning all of it, breaking down what you see into its respective aspects.

What do you need to think about that you never have before?

How do you take all your experience and apply it to this situation?

One of the first questions that came to me:

What’s more comfortable: looking up towards content, or looking down?

This may seem like a facile question, but it’s deceptively important. One of the main differences with VR is the user’s ability to look around the virtual environment. The virtual environment needs to be designed so the user can do this comfortably.

So you’re in the headset and you move your head up and down. You try different rates of motion and different ranges. Then you look down far enough that your chin hits your chest. Now that is not a nice feeling! The sudden contact with your own body disturbs the immersion.

But it’s not as straightforward as ‘looking up is nicer than looking down’.

If the content above you is also too close, it can create a claustrophobic experience. That might be perfect within a specific section of a game, but it’s the last experience you want a user to have when selecting from a wall of video content, for example.
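Those comfort observations could eventually feed a heuristic like the one below. The angle thresholds are guesses I’ve made up for the sketch, not ergonomics data, and they’re exactly the kind of values that testing in the headset should refine:

```python
def gaze_pitch_comfort(pitch_deg: float) -> str:
    """Classify a vertical gaze angle for a seated user (positive = up).

    Thresholds are illustrative placeholders: looking up too far strains
    the neck, and looking down far enough brings the chin to the chest,
    which breaks immersion.
    """
    if pitch_deg > 20:
        return "strained: content is too high above eye level"
    if pitch_deg < -40:
        return "uncomfortable: chin approaches the chest"
    return "comfortable"

print(gaze_pitch_comfort(0))
print(gaze_pitch_comfort(-50))
```

Combined with the distance considerations above, a heuristic like this could flag layouts that force users outside their comfortable range before a single playtest.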

I am going to write more articles on various aspects of testing with Oculus Rift. I also welcome any other testers, or developers with an interest in testing to contact me. Let’s talk about our experiences and build new heuristics that can form a good foundation to help testing with VR develop.

Going freelance

So no-one ever tells you all the stuff you have to do at the start when going freelance!

You always hear the cool stories and how someone’s work/life balance is much better now etc… etc…

It’s quite a lot to take in, but I’m making the move. So if anyone does want to hire me, get in touch and we’ll see how I can help!

Check the about tab for contact details.