Goals: The VR robotics group aims to build an interface between the HTC Vive VR system and a robot so that the robot emulates the VR user's movements.
Description: The project consists of three main parts: VR software programming, robot programming, and the interface between the two.
VR Software Side
We primarily use the SteamVR library in Unity, together with a human model in Unity, to estimate the VR user's movement. The Vive reports the positions of the user's hands (from the two hand controllers) and head (from the headset). From this information, we estimate the user's arm and neck movements through inverse kinematics. The main work on the VR software side is refining the accuracy of the body-movement estimates and converting the resulting human poses into a form the robot can understand.
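To illustrate the inverse-kinematics idea, the sketch below estimates the elbow bend of a two-bone arm (shoulder to elbow to hand) from the straight-line distance between the tracked hand and the shoulder, using the law of cosines. This is a simplified stand-in, not the team's actual Unity/SteamVR implementation, and the segment lengths are assumed values.

```python
import math

def two_bone_ik(shoulder_to_hand_dist, upper_len, lower_len):
    """Estimate the interior elbow angle (radians) of a two-bone arm.

    shoulder_to_hand_dist: straight-line distance from shoulder to hand,
    e.g. derived from Vive headset and controller positions.
    upper_len / lower_len: upper-arm and forearm lengths (assumed known).
    Returns pi when the arm is fully extended, 0 when fully folded.
    """
    # Clamp the target distance to the reachable range so the law of
    # cosines below never produces a value outside [-1, 1].
    d = max(abs(upper_len - lower_len),
            min(shoulder_to_hand_dist, upper_len + lower_len))
    # Law of cosines: d^2 = u^2 + l^2 - 2*u*l*cos(elbow)
    cos_elbow = (upper_len ** 2 + lower_len ** 2 - d ** 2) / (2 * upper_len * lower_len)
    return math.acos(max(-1.0, min(1.0, cos_elbow)))
```

For example, with a 0.30 m upper arm and 0.25 m forearm, a hand 0.55 m from the shoulder yields a fully extended elbow (pi radians). A full-body solver must also resolve the rotational ambiguity of the elbow around the shoulder-hand axis, which is one source of the estimation error mentioned above.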
Robot Programming
The main challenge has been building a robot that can emulate a human: the robot must support the principal ways a human can move. We previously used a robot called Darwin, which could perform most human movements, but it was old and broke repeatedly. We are therefore currently working with a robotic arm, and once the arm works reliably we plan to build a robot closer to a human form, specifically the upper body.
The Interface between the Vive and the Robot
We created a protocol for communicating information from Unity to the robot so that it knows how to move. With the previous robot, we also worked on a video streaming system that let the VR user see from the robot's perspective.
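As a rough sketch of such a protocol, the functions below serialize a set of joint angles into a length-prefixed JSON message that could be sent from Unity to the robot over a socket. The field names and framing are illustrative assumptions, not the team's actual wire format.

```python
import json
import struct

def encode_pose(joint_angles):
    """Serialize a dict of joint angles (radians) into a message with a
    4-byte big-endian length prefix, suitable for a TCP stream."""
    payload = json.dumps(joint_angles).encode("utf-8")
    return struct.pack(">I", len(payload)) + payload

def decode_pose(message):
    """Inverse of encode_pose: read the length prefix, then parse the
    JSON payload back into a dict of joint angles."""
    (length,) = struct.unpack(">I", message[:4])
    return json.loads(message[4:4 + length].decode("utf-8"))
```

The length prefix lets the receiver know where each message ends on a stream socket; a binary encoding of fixed-order floats would be more compact if bandwidth mattered.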
From Darwin to Robotic Arm
Previously, the team used Darwin. Because of complications encountered while programming it, the team switched to a programmable robotic arm instead.