Around 2013–2014, I contributed an article with the title above to the Engineering Research News (ISSN 0217-7870). The theme of that issue was “The Changing Faces of ME”. Mechanical Engineering (ME) is multi-disciplinary.
The following is an edited version:
One key research area pursued by my group is intelligent surgical robotic systems, which augment the surgeon’s hand-eye coordination during operations to achieve the desired outcome and reduce invasiveness.
Hand-eye coordination refers to the ability of our vision system to process the information received through the eyes and use it to control, guide and direct our hands in accomplishing a given task. In this work, we studied hand-eye coordination to build a medical simulator for surgical training and to develop a medical robot that duplicates the hand-eye coordination skills of the best surgeons.
Our research adopts an integrated view of surgical simulators and robot-assisted surgery. The former is a simulation game for surgical training and treatment planning, while the latter involves one or more devices assisting the surgical team in operating precisely on the patient. With a computer simulator, a patient-specific surgical plan that includes robot manipulation can be derived. By combining patient-specific simulation with robotic execution, we can develop highly autonomous robots.
In an automated system, providing proper feedback is crucial to keep the human operator engaged in the decision-making process. Necessary visual, audio and haptic cues should be provided to the human operator in a timely manner, enabling swift intervention. The study of human centricity in an immersive, robot-assisted environment will provide unique insights into human hand-eye coordination capabilities under external influences.
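To make the idea of timely cues and swift intervention concrete, here is a minimal Python sketch of one supervisory cycle. It is an illustrative assumption, not the published system: the force limit, the cue channels and the callback names (`issue_cue`, `request_intervention`) are placeholders for whatever display, speaker, haptic device and robot interface the real platform uses.

```python
import time

# Hypothetical threshold and cue channels for illustration only.
FORCE_LIMIT_N = 2.0          # assumed safe tool-tissue contact force
CUE_CHANNELS = ("visual", "audio", "haptic")


def monitor_step(sensed_force_n, issue_cue, request_intervention):
    """One supervisory cycle: cue the operator when a limit is crossed.

    `issue_cue` and `request_intervention` stand in for the real
    interfaces to the display, speaker, haptic device and robot.
    """
    if sensed_force_n > FORCE_LIMIT_N:
        for channel in CUE_CHANNELS:
            issue_cue(channel, f"force {sensed_force_n:.1f} N exceeds limit")
        # Hand control back to the surgeon for a swift intervention.
        request_intervention()
        return False   # autonomous motion paused
    return True        # autonomous motion may continue


if __name__ == "__main__":
    # Toy usage with print-based stand-ins for the real cue hardware.
    issue = lambda channel, msg: print(f"[{channel} cue] {msg}")
    halt = lambda: print("robot paused; awaiting operator input")
    for force in (0.8, 1.5, 2.6):
        monitor_step(force, issue, halt)
        time.sleep(0.1)
```

The point of the sketch is simply that the cues fire on all channels at once and that the autonomous motion does not resume until the operator has been brought back into the loop.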
A cognitive engine provides the high level of intelligence an autonomous robot needs to be an effective collaborator with humans. The engine possesses knowledge about relevant aspects of surgery, including the dynamics of the surgery, the robot’s actions and the behavior of biological tissue in response to those actions. The actions of the surgical team contribute to these dynamics and, at times, introduce uncertainty into the operation. The self-learning process of the cognitive engine requires inherent knowledge of tissue biomechanics. Biological tissues within the patient’s body cavity are living elements that may be preserved, repaired, or destroyed using mechanical and thermal methods.
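As a rough sketch of what “knowledge of tissue response to robot actions” with a self-learning update could look like, the Python snippet below keeps a running estimate of how a tissue type responds to an action and refines it from intra-operative observations. The class name, the (tissue, action) keying and the averaging rule are assumptions for illustration, not the design published in the cited paper.

```python
from collections import defaultdict


class TissueResponseModel:
    """Toy stand-in for the tissue-biomechanics knowledge of a cognitive
    engine: a running estimate of how a tissue type responds to a robot
    action. The structure and update rule are illustrative assumptions.
    """

    def __init__(self):
        # (tissue_type, action) -> running mean of observed response
        self._mean = defaultdict(float)
        self._count = defaultdict(int)

    def update(self, tissue_type, action, observed_response):
        """Self-learning step: fold a new intra-operative observation in."""
        key = (tissue_type, action)
        self._count[key] += 1
        n = self._count[key]
        self._mean[key] += (observed_response - self._mean[key]) / n

    def predict(self, tissue_type, action, default=0.0):
        """Predict the tissue response to a planned action."""
        return self._mean.get((tissue_type, action), default)


# Example: learn how far an ablation zone grows (mm) for a given setting.
model = TissueResponseModel()
model.update("liver", ("ablate", 30), 8.2)   # 30 W applied, 8.2 mm zone seen
model.update("liver", ("ablate", 30), 8.8)
print(model.predict("liver", ("ablate", 30)))  # about 8.5 mm expected
```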
Surgery can be planned with a virtual robot in a simulator with realistic biomechanical models, and the procedure can then be performed on the patient using the robot with the assistance of advanced man-machine interfaces. Augmented reality technologies with intelligent visual, haptic and audio cues will provide a medium for the surgical team to effectively control the robot.
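The plan-then-execute workflow described above can be summarized in a short sketch: the same waypoint plan first drives the virtual robot in the simulator, and the physical robot only acts on steps the surgeon approves through the man-machine interface. All function and parameter names here are placeholders assumed for illustration, not a real API.

```python
def plan_in_simulator(patient_model, simulate_step):
    """Derive a patient-specific sequence of tool waypoints in simulation."""
    plan = []
    for target in patient_model["targets"]:
        # The biomechanical model predicts how tissue deforms for each step.
        waypoint = simulate_step(patient_model, target)
        plan.append(waypoint)
    return plan


def execute_with_robot(plan, move_robot, surgeon_approves):
    """Replay the approved plan on the physical robot, one step at a time."""
    for waypoint in plan:
        if not surgeon_approves(waypoint):   # man-machine interface check
            break                            # the surgeon retains final authority
        move_robot(waypoint)


if __name__ == "__main__":
    # Toy usage with print- and lambda-based stand-ins for the real system.
    patient = {"targets": ["lesion_A", "lesion_B"]}
    simulate = lambda model, target: f"waypoint_for_{target}"
    plan = plan_in_simulator(patient, simulate)
    execute_with_robot(plan, move_robot=print, surgeon_approves=lambda w: True)
```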
The figure depicting the architecture of an intelligent surgical robotic system with a cognitive engine, as mentioned in the original article, is still a work in progress. Its latest version can be found in:
Tan, X., C. B. Chng, B. Duan, Y. Ho, R. Wen, X. Chen, K. B. Lim and C. K. Chui, “Cognitive engine for robot-assisted radio-frequency ablation system”, Acta Polytechnica Hungarica 14, no. 1 (2017): 129–145.
https://uni-obuda.hu/journal/Tan_Chng_Duan_Ho_Wen_Chen_Lim_Chui_72.pdf