Haptics and Virtual Reality

Kinesthetic Interfaces

The primary objective of these mechatronic interfaces is to enhance virtual training simulations by providing users with the realistic force feedback they would encounter during an operation. Each interface is designed around the haptic cues that are most important in its target operation.

Haptic Interface for Virtual Colorectal Surgeries

In colorectal surgeries, the interaction forces and torques between the colonoscope and the colon tissue deliver important cues to surgeons. As part of the virtual training suite VESS, a 2-DOF haptic interface is designed to mimic the physical interactions that occur during operations such as submucosal injection, tissue cutting, and tool navigation.

This research aims to understand the haptic cues involved in colorectal surgeries and their effectiveness in training surgical skills. Another objective of the project is to characterize the dynamics of the physical interaction between the surgical tool tip and the human colon tissue. The developed interface is attached to the tip of a colonoscope to provide realistic ergonomic conditions. The haptic cues are generated by two electric motors that produce resistance along the tool's two motions: lateral translation and rotation.
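A common way to render such resistive cues is a viscous (damping) law per axis, saturated at each actuator's limit. The sketch below is purely illustrative; the gains and limits are placeholders, not the interface's actual parameters:

```python
def resistance_commands(lateral_vel, angular_vel,
                        b_lateral=2.0, b_rotary=0.05,
                        max_force=10.0, max_torque=0.5):
    """Viscous resistance opposing tool motion in each DOF.

    All gains and limits are illustrative placeholders."""
    clamp = lambda x, lim: max(-lim, min(lim, x))
    force = clamp(-b_lateral * lateral_vel, max_force)    # opposes lateral translation
    torque = clamp(-b_rotary * angular_vel, max_torque)   # opposes rotation
    return force, torque
```

Each motor then receives only the command for its own axis, so the two resistances can be tuned independently.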

Haptic Interface

Kinesthetic feedback devices

These robotic mechanisms provide realistic force feedback to users of virtual reality simulators. Kinesthetic force feedback is essential to creating the sensation of using surgical tools, as it replicates the forces one encounters while interacting with patient anatomy. One of the kinesthetic devices we are creating is a mechanism that replicates standard endotracheal intubation; it must provide motion in four degrees of freedom (DOF) and forces of up to 45 newtons.
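Force feedback of this kind is often rendered with an impedance (spring-damper) law saturated at the device's force limit. A minimal per-axis sketch, with placeholder gains (only the 45 N cap comes from the description above):

```python
import math

def feedback_force(pos, vel, target, k=300.0, b=5.0, f_max=45.0):
    """Spring-damper pull toward a target pose, saturated at the 45 N device limit.

    k and b are illustrative gains, not the device's actual tuning."""
    f = [k * (t - p) - b * v for p, v, t in zip(pos, vel, target)]
    mag = math.sqrt(sum(c * c for c in f))
    if mag > f_max:                       # never command more than the device can render
        f = [c * f_max / mag for c in f]
    return f
```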

Kinesthetic Feedback Device
A custom 2-DOF haptic device consisting of a vertically mounted planar mechanism that moves up/down and in/out to mimic the motion of intubation.

Tactile Interfaces

Passive Tactile Display for Virtual Finger Palpation

Passive Tactile Display Based on MR Powder-Infused Elastic Surfaces

Success in a cricothyrotomy (CCT) depends heavily on the surgeon's ability to identify the anatomical landmarks on the human neck and to make incisions with precision. Surgeons rely on subtle tactile cues at their fingertips to mark the incision site properly. The project VAST aims to develop a tactile interface that mimics the tactile sensations of palpating the human neck and reflects how those cues change as the neck tissues are incised.

The interface consists of an elastic matrix infused with magnetic particles and an array of micro-coils that manipulate the particles within the matrix. The distribution of the magnetic field created by the actuator array defines the mechanical properties of the elastic matrix, turning it into a passive display. A future goal for this hardware is to simulate varying difficulty levels of the CCT operation. Beyond developing the interface, this research aims to understand the underlying mechanisms of finger palpation on human skin.
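The coil-to-stiffness mapping can be pictured as a superposition of local contributions: each energized coil raises the stiffness of the matrix around it. The sketch below uses a Gaussian falloff with made-up constants; it illustrates the idea of a programmable stiffness field, not the actual physics of the MR matrix:

```python
import math

def stiffness_map(coil_positions, coil_currents, grid,
                  k_base=0.2, alpha=1.5, sigma=4.0):
    """Effective stiffness at each grid point as a sum of coil contributions.

    k_base, alpha, and sigma are illustrative placeholders."""
    stiffness = {}
    for gx, gy in grid:
        k = k_base
        for (cx, cy), current in zip(coil_positions, coil_currents):
            d2 = (gx - cx) ** 2 + (gy - cy) ** 2
            k += alpha * current * math.exp(-d2 / (2 * sigma ** 2))
        stiffness[(gx, gy)] = k
    return stiffness
```

Driving different current patterns then reshapes the stiffness field, which is how varying difficulty levels could be presented on the same hardware.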

Immersive Virtual Reality and Self Awareness

Virtual Reality (VR)

Our simulators provide an immersive setting for a variety of training exercises, giving users a new perspective through virtual reality (VR) headsets and custom haptics. Whether the environment is an operating room or even the inside of a patient, we aim to present information in a physically realistic and immersive manner.

VR applications are challenging because they require a different interface than desktop applications: users must feel that they have a presence in the virtual world, and the virtual world needs to respond with both realism and recognizable behavior, all in real time.

Scripting and Visualization

For developing virtual reality applications, performance is critical, so compiled languages such as C and C++ are typically chosen. For our purposes, we have chosen to use the Visualization Toolkit (VTK) as it enables us to mix rendering of realistic environments and simulation data such as densities or currents. However, compiled languages can also substantially increase development time, especially in cases where the simulator’s logic may require multiple iterations before reaching the desired form. For this purpose, we built a framework around VTK using Racket, a Scheme-like language that allows us to develop and adjust scenes in a live environment, enabling faster prototyping and parameter tuning while keeping performance-critical sections, such as graphics and compute, in C++.
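The division of labor can be sketched as follows: the hot path reads scene parameters each frame, while the live scripting layer mutates them between frames. This Python sketch only illustrates the pattern; the actual system keeps the hot path in C++/VTK and the live layer in Racket:

```python
# Shared parameters that a live scripting layer can mutate between frames.
scene_params = {"iso_value": 0.5, "opacity": 1.0}

def render_frame(params):
    """Stands in for the performance-critical C++/VTK render + compute path."""
    return "rendered iso={iso_value} opacity={opacity}".format(**params)

frame_a = render_frame(scene_params)
scene_params["iso_value"] = 0.7   # tuned live, no recompile of the hot path
frame_b = render_frame(scene_params)
```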

Another advantage of using Racket is its module system. Our instructional simulators present educational content through different activities. Each activity is developed as a Racket module that can be activated and deactivated dynamically; each simulator groups a set of targeted activities, and customization can be done quickly by adding or removing modules at the top level.
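The activate/deactivate pattern can be sketched as follows (in Python for brevity; the real framework implements this with Racket modules, and all names here are illustrative):

```python
class Simulator:
    """Minimal sketch of grouping activities that can be toggled at runtime."""

    def __init__(self):
        self.active = {}

    def activate(self, name, activity):
        activity.on_activate()
        self.active[name] = activity

    def deactivate(self, name):
        self.active.pop(name).on_deactivate()

    def tick(self, dt):
        for activity in self.active.values():
            activity.update(dt)

class SuturingActivity:
    """Hypothetical activity that records the lifecycle calls it receives."""

    def __init__(self):
        self.log = []

    def on_activate(self):
        self.log.append("activate")

    def on_deactivate(self):
        self.log.append("deactivate")

    def update(self, dt):
        self.log.append("update")
```

A simulator is then just a particular set of activate calls, so activities can be swapped in or out without touching the others.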

Leap Motion

One of the interface challenges for VR is providing a sense of self-awareness. Using Leap Motion technology, we built an interface that allows users not only to see their hand movements reflected in virtual space but also to interact directly with the virtual environment. Grabbing a tool and moving it in VR provides a familiar experience that helps craft the illusion of being physically present in the simulator.
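A common way to implement such grabbing is pinch detection plus attachment: when the thumb and index fingertips come together near an object, the object follows the pinch midpoint until the pinch is released. The thresholds below are illustrative placeholders, not values from the Leap Motion API:

```python
def update_grab(thumb_tip, index_tip, tool_pos, held,
                grab_radius=0.03, pinch_threshold=0.02):
    """Pinch-to-grab sketch; distances in meters, all thresholds illustrative."""
    dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    pinch_point = [(a + b) / 2 for a, b in zip(thumb_tip, index_tip)]
    pinching = dist(thumb_tip, index_tip) < pinch_threshold
    if pinching and (held or dist(pinch_point, tool_pos) < grab_radius):
        return pinch_point, True    # tool follows the hand while pinched
    return tool_pos, False          # tool stays (or is released) where it is
```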