
Questions tagged [virtual-reality]

The tag has no usage guidance.

3 votes
0 answers
94 views

What vergence angle is commonly used for "infinitely far away" objects in stereoscopic displays?

I think that VR / AR experts will know the answer to this. At what vergence angle are "infinitely far away" objects, such as star field textures, drawn in head-mounted VR / AR displays? I ...
Simon • 193
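For reference, the symmetric-fixation geometry behind this question is easy to compute directly. The sketch below (function name and the 63 mm IPD default are illustrative assumptions, not from the question) shows that the vergence angle falls to exactly zero at infinite distance, i.e. the two visual axes are drawn parallel, separated only by the interpupillary distance.

```python
import math

def vergence_angle_deg(distance_m, ipd_m=0.063):
    """Vergence angle (degrees) for a fixation point straight ahead.

    Each eye rotates inward by atan((ipd/2) / distance); the vergence
    angle between the two visual axes is twice that.  At infinity the
    axes are parallel and the angle is exactly zero.
    """
    if math.isinf(distance_m):
        return 0.0
    return math.degrees(2.0 * math.atan((ipd_m / 2.0) / distance_m))

for d in (0.5, 2.0, 10.0, float("inf")):
    print(f"{d:>7} m -> {vergence_angle_deg(d):.3f} deg")
```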
-1 votes
1 answer
130 views

What stops the auto-generated 3D worlds of Google Earth Pro from being MUCH more detailed and accurate?

I'll admit it blew my mind when I first realized that you could actually enter a "first-person mode" in Google Earth Pro, and not just view the 3D maps from a floating camera in the air. But ...
C. Mollner
0 votes
1 answer
29 views

Render Duplicate Miniature Scene Replica

Context: I am working on a project implementing and user-testing a World-in-Miniature (WIM) interface for VR. A WIM is essentially a replica of the scene the user is currently in, but in miniature. ...
Radu - Andrei Coandă
0 votes
1 answer
285 views

Calculate the position and rotation of a quad in 3d space given a 2d projection of that quad from a camera

I am trying to build a VR tracking system with a laptop webcam, and I have succeeded in identifying and tracking paper markers I put in front of my webcam. For context, I am using OpenCV with the ...
Djgaven588
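Since the question already uses OpenCV, `cv2.solvePnP` with the four known marker corners is the usual off-the-shelf route to the quad's pose. As a rough illustration of the underlying idea for a planar quad, here is a minimal NumPy sketch (all names are hypothetical): estimate a homography by DLT in normalized image coordinates (pixels premultiplied by the inverse camera matrix), then decompose it into rotation and translation.

```python
import numpy as np

def homography_dlt(obj_pts, img_pts):
    """Direct linear transform: homography mapping plane points (X, Y)
    to normalized image points (u, v).  Both arguments: (N, 2) arrays."""
    rows = []
    for (X, Y), (u, v) in zip(obj_pts, img_pts):
        rows.append([X, Y, 1, 0, 0, 0, -u * X, -u * Y, -u])
        rows.append([0, 0, 0, X, Y, 1, -v * X, -v * Y, -v])
    _, _, vt = np.linalg.svd(np.asarray(rows, dtype=float))
    return vt[-1].reshape(3, 3)          # null vector, reshaped to 3x3

def pose_from_quad(obj_pts, img_pts):
    """Recover rotation R and translation t of a planar quad (Z = 0 in
    marker coordinates) from its projection in normalized image coords."""
    H = homography_dlt(obj_pts, img_pts)
    if H[2, 2] < 0:                      # fix the sign: quad in front of camera
        H = -H
    scale = 1.0 / np.linalg.norm(H[:, 0])
    r1 = H[:, 0] * scale
    r2 = H[:, 1] * scale
    r3 = np.cross(r1, r2)                # third axis of a right-handed frame
    t = H[:, 2] * scale
    R = np.column_stack([r1, r2, r3])
    # Re-orthonormalize: the DLT estimate is only approximately a rotation.
    u, _, vt = np.linalg.svd(R)
    return u @ vt, t
```

With noise-free correspondences this recovers the exact pose; with real webcam detections you would feed the same corners to `cv2.solvePnP` instead, which also handles lens distortion and refinement.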
1 vote
1 answer
41 views

Real 3D interior room from 360 images

If I have 360 images of an interior room from all different angles, would it be possible with current 3D software technology to create a real 3D representation of that room, where the camera could ...
m.spyratos
1 vote
1 answer
445 views

OpenGL framebuffer with multiple depth buffers inside

I am trying to put multiple depth buffers into one framebuffer. I want to use VR and render both eyes at the same time: that is, in the geometry stage I want to clone the incoming triangle to two ...
Thomas • 1,299
0 votes
1 answer
56 views

Are non-manifold meshes problematic for virtual reality?

A simple question. I know that non-manifold meshes are problematic for physics simulations, boolean operations, and 3D printing. I was wondering if they can be problematic for a virtual reality ...
Rage • 3
3 votes
2 answers
184 views

Where do computer graphic engineers look for job ads?

This is kind of a meta-question: We are a neuroscience lab looking for an engineer to develop virtual environments for experiments. Since it's not our usual field of job advertisement we do not know ...
fabee • 141
2 votes
0 answers
212 views

Rendering Crystal Clear 3D Text in VR

Would it be possible to make text significantly more legible in VR? I know there are hardware limitations, but are there low-hanging-fruit techniques that aren't being employed in 3D that are in 2D (...
George • 253
7 votes
1 answer
359 views

How can I intercept and filter all frames coming out of SteamVR?

I'm trying to investigate the effect of certain image operations on how VR scenes are perceived. To do this, I'd like to run an off-the-shelf SteamVR application, capture the frames as they come out ...
Dan Hulme • 6,840
4 votes
0 answers
68 views

Is it possible to create a forced focus with a dual layer of images in order to avoid eye strain with virtual and augmented reality?

Would eye tracking allow for the alignment of a dual-layered image (a Google Glass-like device and a screen or projection) to provide a way to force the focus of the eyes to a real-life ...
Ryan • 41
6 votes
1 answer
474 views

Instanced Stereo Rendering vs. Multiple Command Buffers

Source: In this webpage from Nvidia, the authors seem to imply that you could create a command buffer for each eye on separate threads. However, I don't see the benefit of this over instanced stereo ...
aces • 1,353
4 votes
1 answer
2k views

Normal 2D photo to VR-compatible spherical photo

How can I turn a normal photo like this one: into a photo that I can use in my game development platform (Unity3D)? I just wrap a 3D sphere model with the photo and then it can be all around me when I ...
Nani • 41
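One conceptual point behind this question: a normal photo covers only a small solid angle, so wrapping it around a whole sphere will always look stretched; the photo can only fill a small patch of an equirectangular (spherical) image. A minimal NumPy sketch of that patch-pasting step (function name, 60° field of view, and nearest-neighbour sampling are all illustrative assumptions):

```python
import numpy as np

def photo_to_equirect(photo, out_w=512, out_h=256, hfov_deg=60.0):
    """Paste a pinhole photo into an equirectangular canvas (nearest-
    neighbour, no blending).  The photo is assumed to face -Z with the
    given horizontal field of view; everything outside it stays black."""
    h, w = photo.shape[:2]
    f = (w / 2.0) / np.tan(np.radians(hfov_deg) / 2.0)    # focal length, px
    lon = (np.arange(out_w) / out_w - 0.5) * 2.0 * np.pi  # -pi .. pi
    lat = (0.5 - np.arange(out_h) / out_h) * np.pi        # pi/2 .. -pi/2
    lon, lat = np.meshgrid(lon, lat)
    # Unit view direction for every output pixel (y up, -z forward).
    x = np.cos(lat) * np.sin(lon)
    y = np.sin(lat)
    z = -np.cos(lat) * np.cos(lon)
    zc = np.where(z < 0, z, -1.0)         # dummy depth for back hemisphere
    u = (f * x / -zc + w / 2.0).astype(int)
    v = (f * -y / -zc + h / 2.0).astype(int)
    ok = (z < 0) & (u >= 0) & (u < w) & (v >= 0) & (v < h)
    out = np.zeros((out_h, out_w) + photo.shape[2:], photo.dtype)
    out[ok] = photo[v[ok], u[ok]]
    return out
```

To fill the whole sphere you need many overlapping photos (or a 360° camera) stitched together, which is what panorama tools do before the Unity skybox/sphere step.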
3 votes
1 answer
844 views

What is the definition of "motion to photon" in VR?

The rough definition of "MTP" is clear, but the exact one is not. You can refer to these links: "What is Motion-To-Photon Latency?" (chioka.in) and "Motion-to-photon latency" (xinreality.com/wiki). But from ...
Hao Zhang • 109
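Motion-to-photon latency is commonly defined as the time from a head movement to the moment the corresponding photons leave the display. A toy budget makes the definition concrete; every stage time below is an illustrative assumption, not a measurement of any real headset.

```python
# Hypothetical motion-to-photon budget; all numbers are assumptions.
stages_ms = {
    "IMU sample + sensor fusion": 2.0,
    "pose lookup + CPU submit":   3.0,
    "GPU render":                 8.0,
    "compositor + scanout":       6.0,
}
total_ms = sum(stages_ms.values())
for name, ms in stages_ms.items():
    print(f"{name:<28} {ms:4.1f} ms")
print(f"{'total (motion-to-photon)':<28} {total_ms:4.1f} ms")
```

Under these assumed numbers the budget lands just inside the sub-20 ms target quoted elsewhere in this tag.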
0 votes
1 answer
563 views

Distributed parallel rendering in Gaming or VR rendering

VR rendering needs a lot of GPU power. VR SLI can help with that. But is it possible for us to use distributed parallel rendering technology to improve performance dramatically? There is an open ...
Hao Zhang • 109
1 vote
1 answer
178 views

Using GPU in PC for GearVR

Usually a Gear VR + Samsung phone is used for both rendering and display, but I would like to use it only for display when it is plugged into a PC. The reason is to get performance similar to an Oculus ...
MykolaSharhan
8 votes
2 answers
297 views

What methods/technologies to reduce required performance for virtual reality are there?

I'm interested in virtual reality, but according to some sources, less than 1% of computers in use today have the necessary performance to run modern VR games, granted many of them are not intended ...
Syzygy • 183
1 vote
1 answer
697 views

Can frameless rendering reduce latency? And can an FPGA do 3D rendering instead of a GPU?

In virtual reality, the motion-to-photon time is very important. Oculus says it has to be less than 20 ms; maybe 10–15 ms is better. Some people try to introduce frameless rendering technology into VR. ...
Hao Zhang • 109
4 votes
5 answers
1k views

Would cloud-based VR be the future?

Based on the following assumptions, I think cloud-based VR could be the future: 1) Real-world scene rendering needs ray tracing or other complex computer graphics technologies. It needs a lot of ...
Hao Zhang • 109
2 votes
0 answers
168 views

Converting from 360-degree VR to rectangular

What I am interested in doing is taking a 360-degree stereo video (Oculus Rift, etc.) and a description of how the viewing direction changes with time (rather than reading from gyroscopes), and then ...
John Allsup
19 votes
4 answers
4k views

How is VR different from a monitor?

Apparently Macintosh computers cannot handle the Oculus Rift because of their "inferior" graphics cards. But shouldn't VR just be like an external monitor? And concerning computer graphics, how are ...
Hans • 293
2 votes
0 answers
214 views

Free 3D Scene Simulation Framework [closed]

I want to visualize a scene with simple shapes (e.g. boxes, pyramids, etc.), and I am looking for a tool or framework in C/C++ to do this. The focus of the application is to parameterize the relative ...
dcfyg • 21
3 votes
1 answer
2k views

VR stereo rendering with Instancing

I have been reading the paper by Timothy Wilson, Fast Stereo Rendering for VR, and this method of stereo rendering would suit our game engine (DirectX 11). I have managed to get the game ...
Garry Wallis
14 votes
3 answers
3k views

VR and frustum culling

When rendering VR (stereo view) environments, do you recommend just doing two frustum checks to determine what is to be drawn during the frustum-culling pass, or is there some other check that ...
Garry Wallis
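A common alternative to two full checks is to cull once against a merged "superfrustum" that encloses both eye frusta. For reference, a minimal plane/sphere sketch of the two-test baseline (all names hypothetical, frusta given as inward-facing planes):

```python
# Minimal sketch of stereo frustum culling with bounding spheres.
# A frustum is a list of inward-facing planes (nx, ny, nz, d): a point
# p is inside when dot(n, p) + d >= 0, and a sphere may be visible when
# dot(n, center) + d >= -radius for every plane.
def sphere_in_frustum(planes, center, radius):
    return all(
        nx * center[0] + ny * center[1] + nz * center[2] + d >= -radius
        for (nx, ny, nz, d) in planes
    )

def stereo_cull(left_planes, right_planes, center, radius):
    """Draw if either eye can see the sphere.  In practice engines often
    merge both frusta into one superfrustum and run a single test."""
    return (sphere_in_frustum(left_planes, center, radius)
            or sphere_in_frustum(right_planes, center, radius))

# Toy example: each "frustum" here is a single half-space for brevity.
left = [(1.0, 0.0, 0.0, 0.0)]     # x >= 0
right = [(-1.0, 0.0, 0.0, 1.0)]   # x <= 1
print(stereo_cull(left, right, (0.5, 0.0, 0.0), 0.1))
```

Because the eye frusta overlap almost entirely, the merged-frustum variant loses very little culling precision while halving the per-object plane tests.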
16 votes
3 answers
2k views

What is "scanline racing"?

I've heard a lot of people working on VR talk about scanline racing, and that it's supposed to help improve motion-to-photon latency. However, it isn't clear to me how this can be done with OpenGL. ...
Mokosha • 1,144
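On that last question: "scanline racing" (racing the beam) renders each horizontal band of the frame just before scanout reaches it, so each band is drawn with a fresher head pose than a frame rendered all at once up front. A toy latency model (the 90 Hz refresh and 8-band split are assumed numbers) illustrates why this reduces the average pose age at the display:

```python
# Toy model: average pose age at scanout, whole-frame vs. beam racing.
REFRESH_MS = 1000.0 / 90.0        # one refresh at 90 Hz (~11.1 ms)
NUM_BANDS = 8                     # bands rendered just-in-time

def whole_frame_latency():
    """Pose sampled once at frame start; band k scans out at
    (k + 0.5) / N of the refresh, so its pose is that much older."""
    ages = [(k + 0.5) / NUM_BANDS * REFRESH_MS for k in range(NUM_BANDS)]
    return sum(ages) / NUM_BANDS

def beam_racing_latency():
    """Pose re-sampled right before each band; age is under one band."""
    ages = [0.5 / NUM_BANDS * REFRESH_MS for _ in range(NUM_BANDS)]
    return sum(ages) / NUM_BANDS

print(f"whole frame: {whole_frame_latency():.2f} ms average pose age")
print(f"beam racing: {beam_racing_latency():.2f} ms average pose age")
```

This only models the scanout-side contribution; real implementations must also guarantee each band finishes rendering before the beam arrives, which is the hard part with a conventional swap-chain API like OpenGL.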