Browsing by Subject "Surgical Robotics"
Now showing 1 - 2 of 2
Item: Augmenting Surgical Robot Interactions with Intelligent Autonomy (2019-05)
Stephens, Trevor

Humans and machines do not merely coexist; they have evolved to collaborate. This human-machine cooperation, when applied in surgery, is termed computational surgery, an increasingly important aspect of health care. The myriad of sensors equipped on surgical tools, along with the increase in computing power, creates opportunities to improve the care provided to patients. Providing this improved level of care requires algorithms that properly utilize the sensory data. This work does so by creating more accurate models, providing improved estimation, and demonstrating reliable control in applications within computational surgery involving physical manipulation of tissue. A major contribution is grip force and jaw angle estimation for da Vinci surgical tools, with average errors of 0.71 mNm and 0.08 degrees under best-case conditions. Accuracy under further conditions is also reported, including comparisons between tools, across wide frequency and torque ranges, across the life of a tool, and with varying roll, pitch, and yaw angles. Another key contribution is the derivation of an adaptive impedance controller that regulates tool-tissue interaction forces without direct force sensing. The controller is Lyapunov stable, and the tissue parameter estimates converge asymptotically, as verified by simulation results. The work concludes with two motivating examples of leveraging sensory data in computational surgery: a clinical excimer laser atherectomy procedure and robotic tissue grasping, with a stronger focus on the latter.
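The abstract's adaptive controller with Lyapunov-stable parameter convergence can be illustrated with a minimal sketch. This is not the thesis's controller: it is a generic scalar model-reference adaptive loop with a Lyapunov-based update law, in the spirit of estimating an unknown "tissue stiffness" without a force sensor. The plant model, gains, and reference signal below are all invented for demonstration.

```python
import math

# Unknown plant parameter (stands in for an unknown tissue stiffness).
k_true = 2.0
a = 2.0          # reference-model pole
gamma = 5.0      # adaptation gain
dt, T = 1e-3, 30.0

x = xr = k_hat = 0.0
for step in range(int(T / dt)):
    t = step * dt
    r = math.sin(t)                  # persistently exciting reference input
    e = x - xr                       # tracking error (no force measurement used)
    u = (k_hat - a) * x + a * r      # control law built on the current estimate
    # Adaptive law from V = e^2/2 + (k_hat - k_true)^2 / (2*gamma), giving V' <= 0
    k_hat += dt * (-gamma * e * x)
    x  += dt * (-k_true * x + u)     # plant: x' = -k x + u
    xr += dt * (-a * xr + a * r)     # reference model: xr' = -a xr + a r

# After the transient, k_hat approaches k_true and x tracks xr.
```

With the candidate Lyapunov function named in the comment, the tracking error is driven to zero and, because the sinusoidal reference keeps the regressor persistently exciting, the stiffness estimate converges to the true value.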
The entirety of this work is a compilation of seven accepted or submitted conference and journal publications, along with additional supporting material.

Item: A fast, low-cost, computer-vision based approach for tracking surgical tools (2013-08)
Dockter, Rodney Lee

The number of Robot-Assisted Minimally Invasive Surgery (RMIS) procedures has grown immensely in recent years. Like Minimally Invasive Surgeries (MIS), RMIS procedures provide improved patient recovery time and reduced trauma due to smaller incisions relative to traditional open procedures. Given the rise in RMIS procedures, several organizations and companies have developed training tasks and certification criteria for the da Vinci robot. Each training task is evaluated with quantitative criteria such as completion time, total tool path length, and economy of motion, a measurement of deviation from an 'ideal' path. All of these metrics would benefit greatly from an accurate, inexpensive, and modular tool tracking system that requires no modification to the existing robot. While the da Vinci uses joint kinematics to calculate the tool tip position and movement internally, this data is not openly available to users. Even if it were open to researchers, the accuracy of kinematic end-effector position calculations suffers from compliance in the joints and links of the robot as well as finite uncertainties in the sensors. To find an accurate, available, and low-cost alternative for tool tip localization, we have developed a computer-vision based design for surgical tool tracking. Vision systems have the added benefit of being relatively low cost, with typical high-resolution webcams costing around 50 dollars. We employ a joint geometric-constraint and Hough-transform method for locating the tool shaft and subsequently the tool tip. The tracking algorithm was evaluated both on an experimental webcam setup and on a da Vinci endoscope used in real surgeries.
This system can accurately locate the tip of a robotic surgical tool in real time with no augmentation of the tool. The proposed algorithm was evaluated in terms of speed and accuracy. It achieves an average 3D positional tracking accuracy of 3.05 mm at 25.86 frames per second on the experimental webcam setup. On the da Vinci endoscope setup, it achieves a frame rate of 26.99 FPS with an average tracking accuracy of 8.68 mm in 3D and 11.88 mm in 2D. The system also demonstrated successful tracking of RMIS tools in captured video from a real patient case.
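The Hough-transform step named in the abstract can be sketched in a few lines. This is not the thesis's implementation: it is a minimal Hough line transform that finds a synthetic "tool shaft" in a binary edge image and then takes the extremal edge pixel along the detected line as the tool tip. The image size, shaft location, and accumulator resolution are invented for illustration.

```python
import numpy as np

def hough_peak(edges, n_theta=180):
    """Vote edge pixels into a (rho, theta) accumulator; return the peak line."""
    h, w = edges.shape
    diag = int(np.ceil(np.hypot(h, w)))
    thetas = np.deg2rad(np.arange(n_theta))   # angles 0..179 degrees
    ys, xs = np.nonzero(edges)
    acc = np.zeros((2 * diag + 1, n_theta), dtype=int)
    for x, y in zip(xs, ys):
        # Line model: rho = x cos(theta) + y sin(theta), shifted so indices >= 0
        r = np.round(x * np.cos(thetas) + y * np.sin(thetas)).astype(int) + diag
        acc[r, np.arange(n_theta)] += 1
    r_idx, t_idx = np.unravel_index(np.argmax(acc), acc.shape)
    return r_idx - diag, thetas[t_idx]

# Synthetic edge image: a vertical "shaft" at x = 20, spanning y = 5..50.
img = np.zeros((64, 64), dtype=np.uint8)
img[5:51, 20] = 1

rho, theta = hough_peak(img)   # peak at rho = 20, theta = 0 (a vertical line)

# Tool tip: the edge pixel with the largest projection onto the shaft direction.
ys, xs = np.nonzero(img)
proj = xs * (-np.sin(theta)) + ys * np.cos(theta)
tip = (int(xs[np.argmax(proj)]), int(ys[np.argmax(proj)]))   # (20, 50)
```

In practice the shaft line would come from an edge map of the endoscope image rather than a synthetic mask, and the geometric constraint described in the abstract would restrict which accumulator peaks count as candidate shafts; the voting and tip-projection steps are the same idea.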