Humans and machines do not merely coexist; they have evolved to collaborate. When applied to surgery, this human-machine collaboration is termed computational surgery, an increasingly important aspect of health care. The myriad sensors on modern surgical tools, along with ever-increasing computing power, create opportunities to improve the care provided to patients. Providing this improved level of care requires algorithms that make effective use of the sensory data. This work focuses on utilizing such data by creating more accurate models, providing improved estimation, and demonstrating reliable control in computational surgery applications involving physical manipulation of tissue. A major contribution is the grip force and jaw angle estimation of da Vinci surgical tools, achieving average errors of 0.71 mNm and 0.08 degrees, respectively, under best-case conditions. Additional conditions are tested and their impact on accuracy is reported, including comparisons between tools, across wide frequency and torque ranges, over the life of a tool, and with varying roll, pitch, and yaw angles. Another key contribution is the derivation of an adaptive impedance controller that regulates tool-tissue interaction forces without direct force sensing. The controller is Lyapunov stable, and the estimates of the tissue parameters converge asymptotically, as verified by simulation results. This work concludes with two motivating examples of leveraging sensory data in computational surgery: a clinical excimer laser atherectomy procedure and robotic tissue grasping, with a stronger focus on the latter. The entirety of this work is a compilation of seven accepted or submitted conference and journal publications, along with additional supporting material.
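The abstract does not give the controller's equations, but the general idea of regulating contact force while estimating tissue parameters online, with no tip force sensor in the loop, can be illustrated with a minimal simulation. Everything below is hypothetical (a spring-damper tissue model, a gradient adaptation law on a force prediction error, made-up gains), a sketch of the technique rather than the dissertation's actual controller:

```python
# Minimal sketch: force regulation with online tissue-parameter
# estimation.  The tissue is modeled as a spring-damper
# (f = k*x + b*xd); the controller only knows its estimates
# k_hat, b_hat and refines them with a gradient law driven by a
# force prediction error.  All numbers are hypothetical.

dt = 1e-3                       # integration step [s]
k_true, b_true = 200.0, 2.0     # "unknown" tissue stiffness [N/m], damping [Ns/m]
k_hat, b_hat = 50.0, 0.5        # initial parameter estimates
gamma_k, gamma_b = 5000.0, 5.0  # adaptation gains
f_des = 1.0                     # desired contact force [N]

x, xd = 0.0, 0.0                # tool indentation and its rate
for _ in range(40_000):         # 40 s of simulated time
    # Commanded indentation from the current stiffness estimate
    # (no tip force sensor is consulted):
    x_ref = f_des / max(k_hat, 1e-3)
    xd = 20.0 * (x_ref - x)     # first-order tracking of x_ref
    x += xd * dt
    # True tissue reaction (plant side); in a real system this would
    # be inferred from motor-side torques, not a tip sensor:
    f = k_true * x + b_true * xd
    f_pred = k_hat * x + b_hat * xd
    e = f - f_pred              # prediction error driving adaptation
    k_hat += gamma_k * e * x * dt
    b_hat += gamma_b * e * xd * dt

# After the run, k_hat should approach k_true and the contact
# force k_true*x + b_true*xd should settle near f_des.
```

The guard `max(k_hat, 1e-3)` keeps the commanded indentation finite if the stiffness estimate passes near zero; the slow timescale of the adaptation relative to the position loop is what lets the estimates settle smoothly.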