HMD + Kinect = Augmented virtuality

In this article I want to talk about the idea, and a proof of concept, of adding real-world objects to virtual reality.
 
In my opinion, the idea described here will be implemented by every player on the VR market in the near future. The only reason it has not been done yet, I believe, is the desire to ship a polished solution, and that is not so easy.
 
drive.google.com/open?id=1dQrMLWzx72xB8CSa3W3kTNmBs4SDzmL1
 
We add it to any UE project as an ordinary plugin.
 
I could not figure out how to link the .lib file using a relative path, so in OpenNI2CameraAndMesh.Build.cs we specify the full path to OpenNI2.lib.
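For reference, a minimal sketch of what that Build.cs might look like; the install paths and the dependency list are assumptions and have to match your own OpenNI2 setup:

```csharp
// OpenNI2CameraAndMesh.Build.cs -- sketch only, paths are assumptions.
using UnrealBuildTool;

public class OpenNI2CameraAndMesh : ModuleRules
{
    public OpenNI2CameraAndMesh(ReadOnlyTargetRules Target) : base(Target)
    {
        PublicDependencyModuleNames.AddRange(new string[] { "Core", "CoreUObject", "Engine" });

        // A relative path did not work for me, so the absolute path is hard-coded here.
        PublicAdditionalLibraries.Add(@"C:\Program Files\OpenNI2\Lib\OpenNI2.lib");
        PublicIncludePaths.Add(@"C:\Program Files\OpenNI2\Include");
    }
}
```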
 
Then place ADepthMeshDirect at the right spot in the level.
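If you prefer to spawn it from code instead of dragging it into the level, a rough sketch could look like this; the header path, the helper function, and the transform values are all assumptions:

```cpp
// Hypothetical helper: ADepthMeshDirect comes from the plugin, but the header path
// and the transform values below are assumptions.
#include "DepthMeshDirect.h"
#include "Engine/World.h"

static ADepthMeshDirect* PlaceDepthMesh(UWorld* World)
{
    if (!World)
    {
        return nullptr;
    }

    // Position the depth mesh so it lines up with where the real Kinect sees the play area.
    const FVector Location(200.0f, 0.0f, 100.0f);
    const FRotator Rotation(0.0f, 180.0f, 0.0f);

    FActorSpawnParameters Params;
    Params.SpawnCollisionHandlingOverride = ESpawnActorCollisionHandlingMethod::AlwaysSpawn;

    return World->SpawnActor<ADepthMeshDirect>(Location, Rotation, Params);
}
```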
 
At level start, we call the startOpenNICamera method from UTools.
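A sketch of that call, assuming startOpenNICamera is exposed as a static function on UTools; the starter actor and the header names below are hypothetical:

```cpp
// The starter actor is hypothetical; only UTools::startOpenNICamera comes from the
// article, and its exact signature is an assumption.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Tools.h"                      // assumed header for the plugin's UTools class
#include "DepthCaptureStarter.generated.h"

UCLASS()
class ADepthCaptureStarter : public AActor
{
    GENERATED_BODY()

protected:
    virtual void BeginPlay() override
    {
        Super::BeginPlay();

        // Start streaming depth data from the Kinect via OpenNI2 as soon as the level starts.
        UTools::startOpenNICamera();
    }
};
```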
 
Keep in mind that libfreenect2 is used to talk to the Kinect, so the Kinect's USB driver must be replaced with libusbK, following the instructions on the libfreenect2 page.
 
 
 
UPD:
 
At the beginning of the article I said that such a system will soon be in every VR headset, but while writing I somehow overlooked this point and did not elaborate on it.
 
So I will quote the comment I wrote below to expand on this topic:
 
To put it plainly: the reason such a system is needed in every VR system, without exception, is safety.
 
Right now the boundaries of the play area are marked by a schematic cube.
 
But few of us can afford to set aside a completely empty space for VR.
 
As a result, there are objects in the room, sometimes dangerous ones.
 
The main thing I am sure will be done is rendering the entire room in front of the player as a barely perceptible ghost, which on the one hand does not interfere with the perception of the game, and on the other lets you avoid stumbling into things and getting hurt.
 
 
P.S.
 
I want to express my deep gratitude to ArPoint, whose management provided me with the equipment for working on this project.