Robot-assisted Training – Panel Discussion and Presentation in IROS 2020 and ICRA 2021 Workshops

I was a Technical Panel Speaker at RoPat20, an IEEE IROS 2020 workshop. IROS 2020 ran as an on-demand conference from Oct 25, 2020 to Jan 24, 2021.

In this first RoPat workshop titled “Robot-assisted Training for Primary Care: How can robots help train doctors in medical examinations?”, I briefly introduced my research on medical simulation and robot-assisted training for hand-eye coordination at the beginning of the panel discussion.

I began working on medical simulation in the 1990s through a research collaboration between a Singaporean publicly funded research institution and Johns Hopkins University in the US. Our aim was to develop a training simulator for interventional radiology, similar to a flight simulator for pilot training. Training a doctor to become a qualified interventional radiologist is a time-consuming process. Our focus was on providing realistic hand-eye coordination training for the trainees. We reconstructed the human vascular system from the Visible Human Project, and then modelled the interaction between the vessel wall, catheter and guidewire using finite element methods. Our simulator functions like a flight simulator or a computer game – it does not provide direct instruction.
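The flavour of that vessel–guidewire interaction can be conveyed, very loosely, with a toy relaxation sketch. Our simulator used proper finite element models; the snippet below is only an illustrative stand-in, with made-up constants, in which a guidewire is a chain of nodes that balances an elastic smoothing term against a penalty keeping it inside the vessel lumen:

```python
# Toy stand-in for vessel-guidewire interaction (illustrative only; the
# production simulator used finite element models). The guidewire is a chain
# of nodes; each iteration blends elastic smoothing with a wall penalty.

R = 1.0          # vessel lumen radius, centerline at y = 0 (illustrative)
K_WALL = 0.5     # wall-contact penalty weight (illustrative)
K_SMOOTH = 0.3   # elastic smoothing weight (illustrative)

def relax(ys, iters=200):
    """Relax interior guidewire nodes; endpoints are held fixed."""
    ys = list(ys)
    for _ in range(iters):
        new = ys[:]
        for i in range(1, len(ys) - 1):
            # elastic term pulls each node toward its neighbours' midpoint
            smooth = K_SMOOTH * ((ys[i - 1] + ys[i + 1]) / 2 - ys[i])
            # wall penalty pushes a protruding node back inside the lumen
            if ys[i] > R:
                wall = -K_WALL * (ys[i] - R)
            elif ys[i] < -R:
                wall = -K_WALL * (ys[i] + R)
            else:
                wall = 0.0
            new[i] = ys[i] + smooth + wall
        ys = new
    return ys

initial = [0.0, 0.4, 1.8, 2.2, 0.6, 0.0]   # guidewire pushed against the wall
final = relax(initial)
print(max(abs(y) for y in final))  # nodes settle back inside the lumen
```

A real simulator replaces both terms with beam finite elements and contact mechanics, but the same energy-minimization intuition applies.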

About 10 years ago, together with collaborators at NUH and A*STAR research institutions in Singapore, I developed a robot trainer for laparoscopic surgery training. The robot would learn the motions from the master surgeon and guide the trainee to replicate them. The trainee could also freely perform the motions and compare them to the master's. We tested the VR-based training system with medical students and have since transitioned to an AR system.
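Comparing a trainee's free motion to the master's can be done by aligning the two tool-tip trajectories. As a hedged sketch (this is not our system's actual scoring code, and all trajectories below are invented), dynamic time warping gives a similarity score that tolerates differences in speed:

```python
import math

def dtw_distance(a, b):
    """Dynamic time warping distance between two 2-D tool-tip trajectories.

    a, b: lists of (x, y) points sampled along each motion.
    Returns the accumulated alignment cost (lower = more similar).
    """
    n, m = len(a), len(b)
    INF = float("inf")
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = math.dist(a[i - 1], b[j - 1])
            # extend the cheapest of the three admissible alignments
            D[i][j] = cost + min(D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
    return D[n][m]

# Invented example trajectories: a trainee close to the expert's path,
# and one that takes a detour.
expert  = [(0.0, 0.0), (1.0, 0.5), (2.0, 1.0), (3.0, 1.0)]
trainee = [(0.0, 0.1), (0.9, 0.6), (2.1, 1.1), (3.0, 0.9)]
detour  = [(0.0, 0.0), (1.0, 2.0), (2.0, 3.0), (3.0, 1.0)]

print(dtw_distance(expert, trainee))  # small: close to the expert path
print(dtw_distance(expert, detour))   # larger: the detour deviates
```

A lower score means the trainee's motion more closely follows the master's, independent of execution speed.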

The slide below shows the FLS peg transfer setup. Currently, we are focusing on robot motion learning using deep reinforcement learning. We also do scene segmentation and workflow recognition.
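The reinforcement learning loop behind such motion learning can be illustrated in miniature. Our work uses deep networks; the sketch below is only a tabular toy (a one-dimensional stand-in for moving a tool to a peg, with invented states and rewards) showing the trial-and-error update at the heart of RL:

```python
import random

random.seed(0)

N_STATES, GOAL = 6, 5          # toy 1-D "peg transfer": move tool from 0 to peg at 5
ACTIONS = (-1, +1)             # move left / right
ALPHA, GAMMA, EPS = 0.5, 0.9, 0.1   # illustrative hyperparameters

Q = [[0.0, 0.0] for _ in range(N_STATES)]   # Q-value table: state x action

def step(s, a):
    """Apply an action; small motion penalty, reward on reaching the peg."""
    s2 = min(max(s + ACTIONS[a], 0), N_STATES - 1)
    reward = 1.0 if s2 == GOAL else -0.01
    return s2, reward, s2 == GOAL

for episode in range(200):
    s, done = 0, False
    while not done:
        # epsilon-greedy: mostly exploit the current Q-table, sometimes explore
        if random.random() < EPS:
            a = random.randrange(2)
        else:
            a = max((0, 1), key=lambda i: Q[s][i])
        s2, r, done = step(s, a)
        # one-step Q-learning update toward the bootstrapped target
        Q[s][a] += ALPHA * (r + GAMMA * max(Q[s2]) * (not done) - Q[s][a])
        s = s2

policy = [max((0, 1), key=lambda i: Q[s][i]) for s in range(N_STATES)]
print(policy)   # greedy action per state; states before the peg move right
```

Deep RL replaces the Q-table with a neural network so the same idea scales to the continuous states and actions of real surgical tool motion.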

The IROS 2020 RoPat workshop was a success. The 2nd RoPat workshop (RoPat 21), on Robot-Assisted Systems for Medical Training, was organized at the IEEE ICRA 2021 conference, held from May 30 to June 5, 2021, in Xi'an, China. It was a hybrid event.

This time I participated in the workshop as an invited technical speaker. I delivered an online presentation on physical tissue modelling and augmented reality in medical training. I apologize for not being able to answer questions immediately after my oral presentation. If you have any questions regarding my talk, please feel free to email me.

Below are the title and abstract of my talk at the IEEE ICRA 2021 RoPat workshop.

Title: Medical Simulation and Deep Reinforcement Learning

Abstract: Medical simulation provides clinicians with real-time interactive simulations of surgical procedures to enhance training, pre-treatment planning, and the design and customization of medical devices. Robots are increasingly becoming integrated elements of surgical training systems. We propose a robot-assisted laparoscopy training system that extensively utilizes deep reinforcement learning (DRL). By combining exercises, demonstrations from human experts, and RL criteria, our training system aims to improve the trainee’s surgical tool manipulation skills. DRL plays a crucial role in modelling the interaction between biological tissue and surgical tools. Additionally, we explore the application of DRL in surgical gesture recognition. As a pathway to artificial general intelligence, DRL has the potential to transform traditional medical simulation into intelligent simulation.

AE-CAI | CARE | OR2.0 – Intelligent Cyber-Physical System for Patient-Specific Robot-Assisted Surgery and Training

Cyber-Physical Systems (CPS) are advanced mechatronic systems and more. In this 45-minute keynote lecture, I presented our work on CPS for robot-assisted surgical training and surgery. The intelligent CPS is a Cyber-Medical System that integrates medical knowledge to enable smart, personalized surgery. Human centricity is extended to the interaction and collaboration with the robot(s) in the robot-assisted environment. To be an effective and efficient assistant to the human surgical team, the robot in the operating room should possess intelligence. The intelligent CPS presents an opportunity to address the challenging tool-tissue interaction problems central to patient-specific surgery.

Cyber-medical system for patient-specific medical devices development

Above is the title of my invited talk at the 13th Annual IEEE International Conference on Nano/Micro Engineered and Molecular Systems (IEEE NEMS 2018), held in Singapore from April 22 to 26, 2018.

Below is the abstract of my talk:

With increasing demands for quality and affordable healthcare services, organizations in the medical device manufacturing industry are embracing more intelligent and responsive systems through the integration and development of dynamic digital technologies. We propose a Cyber-Physical System (CPS)-based production system with integrated enabling digital technologies and robot assistance. This system has the potential to enhance productivity and sustainability, particularly in the production of patient-specific hybrid medical devices. Hybrid medical devices incorporate multiple components and materials that must function flawlessly over extended periods, often under the demanding conditions of the human body. Examples of hybrid medical devices include artificial tracheas, artificial pancreases for diabetes treatment, and information-delivering microchips.

The proposed CPS-based manufacturing system utilizes an integration of enabling digital technologies, including Augmented Reality (AR), Wireless Sensor Networks (WSN), the Internet of Things (IoT), and Artificial Intelligence (AI), for the fabrication of patient-specific medical devices. Visual and haptic cues are provided to the human operator in a timely manner, allowing for swift intervention. Importance-driven computer graphical rendering of visual cues is embedded into physics-based simulations for haptic rendering.

The study of human centricity in an immersive and robot-assisted environment will provide unique insights into human hand-eye coordination capabilities under external influences. Intriguing scientific questions include the extent to which individuals can learn and develop motor skills with external guidance.

 

Note: There is an IEEE SMC Technical Committee (TC) on Cyber-Medical Systems. More information about the TC can be found here.

Robotic surgery: hand-eye coordination, cognition and biomechanics

Around 2013-2014, I contributed an article with the title above to Engineering Research News (ISSN 0217-7870). The theme of that issue was "The Changing Faces of ME" – Mechanical Engineering (ME) is multi-disciplinary.

Following is an edited version:

One key research area pursued by my group is intelligent surgical robotic systems, which augment and enhance the hand-eye coordination capability of the surgeon during operations to achieve the desired outcome and reduce invasiveness.

Hand-eye coordination refers to the ability to process the information received through the eyes in order to control, guide and direct the hands in accomplishing a given task. In this work, we studied hand-eye coordination to build a medical simulator for surgical training and to develop a medical robot that replicates the best surgeon's hand-eye coordination skills.

Our research adopts an integrated view of surgical simulators and robot-assisted surgery. The former is a simulation game for surgical training and treatment planning, while the latter involves one or more devices assisting the surgical team in precise patient operations. With a computer simulator, a patient-specific surgical plan can be derived with robot manipulation included. By combining patient-specific simulation with robotic execution, we can develop highly autonomous robot(s).

In an automated system, providing proper feedback is crucial to keep the human operator engaged in the decision-making process. Necessary visual, audio and haptic cues should be provided to the human operator in a timely manner, enabling swift intervention. The study of human centricity in an immersive and robot-assisted environment will provide unique insights into human hand-eye coordination capabilities under external influences.

A cognitive engine provides a high level of intelligence in the autonomous robot, enabling it to be an effective collaborator with humans. The engine possesses knowledge about relevant aspects of surgery, including the dynamics of the surgery, the robot actions and the behavior of biological tissue in response to those actions. The actions of the surgical team contribute to the dynamics and, at times, introduce uncertainty to the operation. The self-learning process of the cognitive engine requires inherent knowledge of tissue biomechanics. Biological tissues within the human patient body cavity are living elements that may be preserved, repaired, or destroyed using mechanical and thermal methods.

Surgery can be planned with a virtual robot in a simulator with realistic biomechanical models, and then the procedure can be performed on the patient using the robot with the assistance of advanced man-machine interfaces. Augmented reality technologies with intelligent visual, haptic and audio cues will provide a medium for the surgical team to effectively control the robot.

The figure depicting the architecture of an intelligent surgical robotic system with a cognitive engine, as mentioned in the original article, is still a work in progress. Its latest version can be found in:

Tan, X., C. B. Chng, B. Duan, Y. Ho, R. Wen, X. Chen, K. B. Lim and C. K. Chui, "Cognitive engine for robot-assisted radio-frequency ablation system", Acta Polytechnica Hungarica 14, no. 1 (2017): 129-145.

https://uni-obuda.hu/journal/Tan_Chng_Duan_Ho_Wen_Chen_Lim_Chui_72.pdf

 

Medical simulation system for catheterization

This picture of me was taken in the 1990s when I was developing a medical simulation system for catheterization. In the photo, you can see that I hung catheters on the window panel next to the door of my office.

The simulation system was a new concept at the time and a novel approach to providing physicians with real-time interactive simulation of vascular catheterization procedures. Its purpose was to enhance training, as well as to improve pretreatment planning and the design of medical devices. https://www.nlm.nih.gov/archive/20120612/research/visible/vhp_conf/chui/index.htm