After the robotCalibration plugin, we integrated real-time point cloud streaming. A Kinect (replaceable with other depth cameras) acts as both the gesture recognizer and the point cloud streaming device. We use the point cloud with the digital twin at a 1:1 scale, so the virtual reality scene matches the point cloud; this is how we integrate reality and virtual reality. For more information, take a look at our Twitter page and watch the video. Get the code at the ApertusVR git repo.
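The 1:1 alignment works because a depth camera reports distances in real-world meters, so back-projected points need no extra scaling to match a 1:1 digital twin. The sketch below is illustrative only (a generic pinhole camera model, not the ApertusVR API); the intrinsics `fx`, `fy`, `cx`, `cy` are hypothetical values.

```python
# Illustrative sketch only -- not the ApertusVR API. It shows the core idea
# behind 1:1 alignment: depth pixels are back-projected into metric 3D
# coordinates (meters), so the resulting points line up with a 1:1 twin.

def depth_to_points(depth_rows, fx, fy, cx, cy):
    """Back-project a depth image (in meters) into a list of (x, y, z)
    points using the pinhole model: x = (u - cx) * z / fx, and so on."""
    points = []
    for v, row in enumerate(depth_rows):
        for u, z in enumerate(row):
            if z <= 0.0:
                continue  # skip invalid (zero or negative) depth readings
            x = (u - cx) * z / fx
            y = (v - cy) * z / fy
            points.append((x, y, z))
    return points

# A valid depth of 2 m at the principal point maps straight ahead, 2 m out:
pts = depth_to_points([[0.0, 2.0]], fx=500.0, fy=500.0, cx=1.0, cy=0.0)
# -> [(0.0, 0.0, 2.0)]
```

Because both the twin and the point cloud are expressed in meters, the streamed cloud can be placed directly in the virtual scene without rescaling.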
Senior software architect
As a relatively new kid on the block of the AR/VR market, we got the opportunity to participate in the Alpha programme of Websummit 2017. This is a great opportunity for us to build new connections and demonstrate the capabilities of our programmers' library. If you share our passion for AR/VR technologies, or see the potential in our developments, you will have the opportunity to meet us in person from the 5th to the 9th of November in Lisbon at Websummit. We will exhibit for one day but will be there for the whole event.
As the developers of ApertusVR, we are able to deliver fast completion times: shorter development cycles, reduced development costs, and a clear responsibility matrix. Depending on your project, we can involve other colleagues from the Institute and build a flexible team of experts that meets the requirements of the project we are contracted for.
As ApertusVR’s main purpose is to be used in your products, it needs further development tailored to your solutions, making it a unique set of tools for you. To achieve this, we offer the following support packages so you can cut your development time:
Feature request priority
Private feature integration
1111 XI. distr. Kende str. 13-17