[FoRK] InfinitEye 210 Degree HMD Technical Q&A: How does High FOV Virtual Reality Work?
eugen at leitl.org
Wed Nov 27 04:55:14 PST 2013
InfinitEye 210 Degree HMD Technical Q&A: How does High FOV Virtual Reality Work?
November 26, 2013 by Paul James
We put your questions to the people behind the virtual reality HMD with a 210
degree field of view. Here are team InfinitEye's answers, exclusive to Road to VR.
Your Questions: Answered.
You may remember, a few weeks ago I headed out to Toulouse, France to meet
with Stephane, Lionel and Robin—the team behind the virtual reality HMD which
offers a staggering 210 degree field of view. Team InfinitEye were gracious
hosts, and in addition to showing me around their beautiful city they were
kind enough to show me what a panoramic VR experience feels like. I was
impressed. Not only had this small, dedicated team of friends designed and
built this amazing device from scratch, they'd also largely written their own
software and APIs, and produced their own bespoke demos. To call them dedicated
and passionate is to undersell them somewhat; they're obsessed!
I was the first person in the world outside this tight-knit group to
experience the device; unsurprisingly people have questions about it. Before
I embarked on my trip I asked you, the readers, what you’d like to ask the
team. They have taken time out of their hectic schedule to answer them in
detail, so that Road to VR readers can get their heads behind the device, so to
speak. Let's get started.
Road to VR: Describe the components you use to construct the InfinitEye and
their individual costs.
InfinitEye: We use two 7″ HD screens (1280×800) for $200, 4 high quality Fresnel
lenses (2 per eye) for $20, a 3 DOF tracker from YEI for $99, expanded PVC
sheets for the casing (~$10), black paint ($5), mounting hardware (~$25) and
cables (~$12).
Road to VR: What’s the vertical FOV of the device?
InfinitEye: Vertical FOV is ~90°. We designed our prototypes with the vertical
center of the screens shifted a little towards the ground in order to give a
more natural vertical FOV. As with human vision, you can see more below the
line of sight than above it.
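That downward shift can be expressed as an asymmetric (off-center) projection frustum. A minimal sketch in Python, assuming an illustrative 40°/50° up/down split of the ~90° vertical FOV; the actual InfinitEye split is not public:

```python
import math

def vertical_frustum(up_deg, down_deg, near=0.1):
    """Near-plane top/bottom extents for an asymmetric vertical frustum.

    up_deg / down_deg are the half-angles above and below the view axis.
    Shifting the optical center toward the ground means down_deg > up_deg,
    mimicking natural human vision.
    """
    top = near * math.tan(math.radians(up_deg))
    bottom = -near * math.tan(math.radians(down_deg))
    return top, bottom

# Illustrative 40/50 split of the ~90 degree vertical FOV
# (assumed values, not InfinitEye's actual numbers):
top, bottom = vertical_frustum(40.0, 50.0)
```

These top/bottom extents are exactly what an off-center projection call (e.g. OpenGL's `glFrustum`) takes, so the shifted center costs nothing extra at render time.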
Road to VR: How much does the current prototype weigh and what are its dimensions?
InfinitEye: We have 2 different designs for our prototypes (see image at top
of this page):
The initial version with an elastic band is ~400g total with the cables. The
current prototype is 354g for the headset, plus ~200g for the head mount,
aluminum bar and screws, top strap band, cables etc., for a total of 553g. Note
that these are only early designs which are intended to be refined.
Road to VR: What tracker is used and what are its specifications?
InfinitEye: An embedded YEI 3-Space Sensor, used in streaming mode with a
refresh rate of 1000Hz. It features 3-axis gyroscopes, accelerometers and
magnetometers and provides accurate absolute orientation.
Road to VR: Have you considered any solutions for positional tracking?
InfinitEye: Yes, but it's too soon for our team to disclose anything on the subject.
Road to VR: Can you go into more detail on the lens assembly (i.e. stacked
Fresnel) and why you need 2 lenses per eye?
InfinitEye: We have to use 2 stacked Fresnel lenses to get sufficient
magnification. We couldn't find retail lenses with a short enough focal length
and a large enough size to fit the InfinitEye design. However, we hope to be
able to manufacture custom lenses, which would reduce the weight, improve the
quality (less smearing) and hopefully reduce the cost.
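Why stacking works can be seen from the thin-lens approximation: two lenses in contact combine as 1/f = 1/f1 + 1/f2, so two identical Fresnel lenses roughly halve the focal length. A sketch with purely illustrative focal lengths, since the real InfinitEye values are not public:

```python
def stacked_focal_length(f1_mm, f2_mm):
    """Effective focal length of two thin lenses in contact:
    1/f = 1/f1 + 1/f2 (thin-lens approximation, zero separation).
    Stacking two identical lenses halves the focal length, giving
    more magnification than any single retail lens of that size.
    """
    return 1.0 / (1.0 / f1_mm + 1.0 / f2_mm)

# Illustrative: two assumed 150mm Fresnel lenses stacked together
f = stacked_focal_length(150.0, 150.0)  # -> 75.0 mm
```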
Road to VR: Describe the breakout / junction box and connectivity to the
InfinitEye from it (i.e. how many cables, is USB / power sent up spare pins
on the HDMI cables etc.)
InfinitEye: The breakout board takes 2 HDMI inputs, 12V power and USB, then
sends the different signals and power to the 2 screens, and the tracking data,
through very thin HDMI cables. It's a temporary solution that keeps the cables
light, but we will certainly move to custom cables (maybe one instead of two).
Road to VR: What materials have you used to construct the chassis / casing?
InfinitEye: Expanded PVC,
2mm thick, completely water resistant and not as fragile as people might think
(from the manufacturer's data sheet, impact resistance is 15 kJ/m²). It's not
suitable for a finished product though.
Road to VR: What material might you use for a future, consumer version
casing? How much might that weigh?
InfinitEye: We have different ideas to keep the casing very light but we need
to perform some tests so we prefer not to disclose anything for the moment.
Road to VR: What are the minimum PC specifications to run games on the InfinitEye?
InfinitEye: A mid-range PC from 2012 should be enough but of course it
depends on the game you want to play. For your information, our current setup
is a core i7 CPU with a GTX 660 Nvidia graphics card and it works perfectly.
The graphics card must have at least 2 HDMI/DVI outputs or 1 DisplayPort
output (to be used with a specific device that splits the signal into 2 HDMI outputs).
Road to VR: What have you used for the head strap / harness? What might you
use in a future, consumer model?
InfinitEye: The current prototype uses a strap scavenged from a face shield
made for gardening. An elastic band should be cheaper and help reduce the
cost of the consumer product.
Road to VR: The unit is currently open around the eyes; what designs are you
considering to resolve this?
InfinitEye: The
open design is great because the skin of your face can breathe and there is no fog
on the lenses. We are considering making a removable light blocking system
but since we might not keep this design this solution could become
irrelevant. Note that we had a light blocking system on the first prototype
which had an elastic band. We are in a stage where we are trying different
designs to see what would be the best in terms of weight and comfort.
Road to VR: Describe the process of rendering a 210 degree stereoscopic image
for use with the InfinitEye.
InfinitEye: At present we use 4 renders, 2 per eye. The left eye's renders are
centered on the left eye: the first is rotated 90° to the left and the second
looks straight ahead, building two sides of a cube. The right eye's renders are
centered on its position: the first is rotated 90° to the right and the second
looks straight ahead, forming two sides of another cube. We then process those
renders in a final pass, building the distorted image.
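The camera setup described above can be sketched as follows. This is an illustration under assumed conventions (right-handed coordinates, +Y up, -Z forward, positive yaw turning left), not InfinitEye's actual code:

```python
import math

def eye_view_yaws(eye):
    """Yaw angles in radians for one eye's two renders:
    [outward side face, front face].  Each render covers a 90 degree
    FOV, so the pair forms two adjacent faces of a cube centered on
    that eye: one turned 90 degrees outward, one looking straight ahead.
    """
    outward = math.pi / 2 if eye == "left" else -math.pi / 2
    return [outward, 0.0]

def forward_vector(yaw):
    """Forward direction after rotating the -Z view axis by `yaw`
    about the +Y (up) axis."""
    return (-math.sin(yaw), 0.0, -math.cos(yaw))
```

With this convention the left eye's side render faces -X (the wearer's left), the right eye's faces +X, and both front renders face -Z, so the four 90° frusta together span the 210°+ horizontal arc with overlap in the middle for stereo.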
Road to VR: What demos / software currently work with the InfinitEye and what
engines / frameworks do they use?
InfinitEye: For now we have 3 demos and they use our custom engine. We plan
to build a low level SDK in the near future, and then build support for
popular engines based on it.
Road to VR: Would the SDK for the InfinitEye be an open source offering?
InfinitEye: …we plan to open source it, but only after our crowd funding
campaign, if we make it to that point.
Road to VR: Are you targeting any engines or middleware specifically? How
about APIs such as DirectX, OpenGL or AMDs Mantle?
InfinitEye: Ideally we would like to target all APIs and middleware that
allow rendering. That’s why we plan to build a low level SDK that can be
easily integrated by engines and middleware. DirectX and OpenGL are the
lowest level of accelerated rasterization, our SDK will target these,
providing the functions to correctly setup the cameras and then perform the
final pass, merging the renders and performing distortions. Mantle is not
available yet; we'll consider it when we get more information about it.
Road to VR: How simple do you think it might be for engines such as Unity to
include support for the InfinitEye? Could plugins be used by developers to add support themselves?
InfinitEye: We’re currently looking into this. We have little knowledge of
these engines, but in the end the process of rendering frames is known and
well defined. We need to find the right entry points to call our future SDK
in order to output content for the InfinitEye.
Road to VR: What are the overheads like for rendering such a wide Field of View?
InfinitEye: Overheads are primarily for the graphics card. A wider FOV means
more objects are visible at the same time. Most of the games we play only
display a 60° FOV by default. More objects to display means more vertices to
process and more pixels to compute. The performance drop is difficult to
predict, but we certainly don't expect a linear performance decrease relative
to the FOV increase.
Road to VR: Do you foresee game design challenges rendering in a 210 degree FOV?
InfinitEye: Not really, there should not be specific challenges related to
the 210° FOV. On the contrary, a panoramic field of view offers more
possibilities to VR game creators, for example they could add effects in the
peripheral vision to improve the experience. A horror game with ghost
appearances in the peripheral vision could make your heart skip a beat!
Road to VR: How do you intend to generate interest in your product and in
particular how do you think you can get developers on board?
InfinitEye: Like any other piece of hardware: with quality content running on
it. That's why we have to make a development kit and get it into as many
developers' hands as we can. A crowd funding campaign might be the way to go.
Road to VR: What software correction routines do you have currently and which
ones are planned? Are there particular challenges rendering for Fresnel lenses?
InfinitEye: For now we're only addressing geometric aberration; we have tried
different distortion algorithms, and in our demos we're currently using an
experimental fisheye distortion. It works quite well, but we have plans to
improve it further. Chromatic aberration is the other problem; it's not a show
stopper and we definitely have plans to address it.
Road to VR: Concerns have been raised over the syncing of frames between your
two displays, in particular on Windows based systems. Have you seen any such
artifacts? Is there anything in your rendering routine that combats this?
InfinitEye: We have never experienced this issue. From a software point of
view we’re only displaying to a single 2560 x 800 frame buffer. Display
synchronization is up to the graphic driver.
Road to VR: How did you correct the image for rendering, with a distortion
shader like in the Oculus SDK or did you try to model a ray traced solution
taking into account the surface equation and refractive index of the Fresnel lenses?
InfinitEye: We use a distortion shader, as Oculus does. As our main content
source is rasterization (OpenGL, DirectX), this is the way to go. When we have
ray traced content then yes, we'll perform ray perturbations to account for
lens aberrations, but we don't have any at the moment.
Misc / Community
Road to VR: Do you have any upcoming competitions / shows you'd like to talk about?
InfinitEye: We participated in the finals of the French contest #cent1projets
on November 18th in Paris. The prize is not high but it would still be
helpful for us; we are still waiting for the results. We will also attend
Inition's VR meet-up in London on December 12th and 13th, where we will meet
the press and a lot of developers; it's going to be great exposure for us.
Road to VR: What are your plans for funding the project? Is crowd-funding on
the agenda? If so, when?
InfinitEye: We are definitely considering crowd-funding as an option for
funding our project. We still have quite a few things to do before launching
such a campaign but Q1 2014 could be a good fit.
Road to VR: If you were to guess at a target retail price for the consumer
version of the InfinitEye, what might it be?
InfinitEye: That’s a tricky question, possibly $400 but it’s still too early
to give an accurate estimation. Be assured that we’ll do our best to propose
the lowest possible price.
Road to VR: Do you think there is enough room in the marketplace for multiple
VR HMDs? Do you see your product as somehow a premium option?
InfinitEye: In every market there should be enough space for different
solutions. If games are developed with VR in mind, it should be easy to make
them compatible with different VR HMDs, so anything is possible. I don't know
if the InfinitEye would be a premium option; let's just say an alternative. A
lot of players are working on improving the VR experience, so we don't know
what will happen next.
Road to VR: If you had a chance to collaborate with Oculus on a project,
perhaps the InfinitEye, would you do it?
InfinitEye: Why not? They definitely have a great and impressive team.
Road to VR: What displays have you considered for the next version of the InfinitEye?
InfinitEye: Two 7″ 1920×1200 panels would be nice, but we also have other
ideas that we prefer not to disclose for now.
FredZ posted a stream of excellent questions in our comments—most have been
answered above—the remainder are below:
FredZ: Will the final version still use the dual 7″ displays or will you use
smaller panels, in order to get a smaller form factor and a more appealing
look and feel for the HMD ?
InfinitEye: Same as above, we are considering using different panels but it’s
too early to talk about this.
FredZ: Did you have a try at supporting 1080p or 1200p MIPI panels, since
building HDMI to MIPI boards seems to be feasible these days?
InfinitEye: Not yet, but we are looking into this.
FredZ: Do you intend to go wireless, and with which technology for the
display and the tracker?
InfinitEye: It’s not our priority but we have a few ideas and contacts to
achieve this, but nothing to talk about.
Road to VR: What are your thoughts on the current Oculus Rift Dev Kit (DK1)?
InfinitEye: It’s definitely a great device and we were very impressed when we
first tested it. Still, we think that a wide FOV is a key factor in gaining
more immersion, we are looking forward to seeing what will be the DK2.
Our thanks go out to Stephane, Lionel and Robin for taking time out of their
hectic schedule to get these answers to us. Remember, you can connect and
stay in touch with the team over at their Facebook page. I’ll be catching up
with InfinitEye when I attend the London Virtual Reality Meetup on December
12th. I look forward to being wowed by
high FOV VR all over again.