Selected Solution
The solution we have selected uses a virtual reality headset such as the Oculus Rift to control a robot remotely. The robot itself would carry a sensor suite built around LiDAR (light detection and ranging), which measures the distance between the sensor and an object by timing how long an emitted pulse of light takes to reflect back. Collecting thousands of these reflected points lets the sensor build a point cloud, which becomes the 3D image the operator views through the VR headset. The sensor suite also includes a camera that associates each point with the color of the object it represents, and cameras on the robot’s “hands” would provide further context for the operator.
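As a concrete illustration of the distance calculation described above, the following Python sketch shows how a single time-of-flight return could be converted into a 3D point and paired with a camera color. This is a minimal sketch under stated assumptions, not our production design: the function names and the camera_color_at lookup are hypothetical stand-ins for whatever interface the actual sensor suite would expose.

```python
import math

SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def return_to_point(time_of_flight_s, azimuth_deg, elevation_deg):
    """Convert one LiDAR return into an (x, y, z) point in the sensor's frame.

    The pulse travels to the object and back, so the one-way distance
    is half the round-trip time multiplied by the speed of light.
    """
    distance = SPEED_OF_LIGHT * time_of_flight_s / 2.0
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    # Spherical-to-Cartesian conversion for the beam direction.
    x = distance * math.cos(el) * math.cos(az)
    y = distance * math.cos(el) * math.sin(az)
    z = distance * math.sin(el)
    return (x, y, z)

def build_colored_cloud(returns, camera_color_at):
    """Pair each 3D point with the camera color seen in that direction."""
    cloud = []
    for tof, az, el in returns:
        point = return_to_point(tof, az, el)
        color = camera_color_at(az, el)  # hypothetical lookup into the camera image
        cloud.append((point, color))
    return cloud

# Example: one return arriving 66.7 ns after the pulse (about 10 m away),
# straight ahead, colored by a stand-in camera lookup.
cloud = build_colored_cloud(
    [(66.7e-9, 0.0, 0.0)],
    camera_color_at=lambda az, el: (128, 128, 128),
)
print(cloud)
```

Halving the round-trip time before multiplying by the speed of light is the key step; a real scanner would repeat this for thousands of returns per sweep to populate the point cloud the operator sees in the headset.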
Clarification of Need
In our initial proposal, we indicated that the design solution would benefit those performing “dangerous duties.” We felt it was essential to clarify what those duties might entail and what the resulting uses for this system might be.
The most critical issue in these scenarios is the inherent hazard of the environment. We initially used the example of an Explosive Ordnance Disposal (EOD) team. In that situation, the robot would take the place of the EOD technician, who would instead control the robot through the Virtual Reality Neurorobotic system without risk from an explosion. Another potential use is in toxic environments, such as chemical or biological threats, where a doctor or medic may need to reach an injured party without endangering their own life. Our solution still puts the medical professional’s expertise to work, potentially saving a life at no personal risk. The system could also serve exploratory and scientific purposes in extreme environments: the deepest parts of the ocean, extreme heat or cold, the vicinity of a volcano, or even the vacuum of space. The purpose is to preserve a human level of responsiveness and skill while removing the operator from a potentially life-threatening situation.
Progress and Path Forward
This design group is currently meeting its expected timelines. With a solution selected and the statement of need clarified, the group has moved on to building an executable design, developing cost estimates for production, and assembling these into a final product presentation. This process is expected to take two more weeks.
Group Member Roles
- Findings and Evaluation of Ideas