M2 Internship: Surgical Workflow Recognition in Robot-Assisted Hysterectomies
Internship · M2 internship · 6 months, Bac+5 / Master's level · LTSI (Laboratoire Traitement du Signal et de l'Image), MediCIS team · Rennes (France)
Start date: 1 January 2024
Deep-learning-based surgical workflow recognition in robotic surgery.
Through a collaboration with a major surgical-robot manufacturer and the obstetrical surgery department of Rennes University Hospital, we have access to a large and rare dataset of more than 80 hysterectomies. The dataset comprises surgical videos, automatically recorded robotic-arm kinematics, and a manually recorded description of the surgical workflow, an annotation process that is time-consuming. To include new surgeries, an automatic workflow recognition method must be developed. For this, convolutional neural networks, recurrent neural networks, or transformers could be used.
The goal of developing such a method is to enlarge the database in order to study surgical workflows specific to patients' anatomical characteristics and to surgical teams' habits or preferences. In previous studies, we have demonstrated the existence of such preferences in both the procedural and kinematic aspects, highlighting sequences of activities and kinematic performances specific to the level of experience and also to each surgeon individually.
Objective of the internship
The objective of this work is to develop automatic workflow recognition deep-learning methods based on video and/or kinematic data. To achieve this goal, the internship will be divided into 3 steps:
- Study the state of the art on automatic workflow recognition;
- Develop automatic workflow recognition deep-learning methods based on video and/or kinematic data;
- Compare the performance of the methods on the available surgical data.
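To give a rough sense of the kind of model step 2 involves, the sketch below runs a minimal Elman-style recurrent network over a kinematic sequence and assigns a phase label to every timestep. This is purely illustrative: the feature count, number of phases, and all weights are assumptions, and a real model would be trained on the annotated hysterectomy data rather than randomly initialised.

```python
import numpy as np

# Illustrative sketch only: an Elman-style RNN labelling each timestep of a
# kinematic sequence with a surgical-phase index. Dimensions (N_FEATURES,
# N_PHASES, ...) are assumed values, not taken from the dataset description.

rng = np.random.default_rng(0)

N_FEATURES = 7   # assumed number of kinematic channels per timestep
N_HIDDEN = 16    # recurrent state size
N_PHASES = 4     # assumed number of workflow phases

# Randomly initialised weights; training on annotated surgeries would fit these.
W_xh = rng.normal(scale=0.1, size=(N_FEATURES, N_HIDDEN))
W_hh = rng.normal(scale=0.1, size=(N_HIDDEN, N_HIDDEN))
W_hy = rng.normal(scale=0.1, size=(N_HIDDEN, N_PHASES))

def recognize_phases(kinematics: np.ndarray) -> np.ndarray:
    """Return one phase index per timestep for a (T, N_FEATURES) sequence."""
    h = np.zeros(N_HIDDEN)
    labels = []
    for x_t in kinematics:
        h = np.tanh(x_t @ W_xh + h @ W_hh)  # recurrent state update
        logits = h @ W_hy                   # per-phase scores at this timestep
        labels.append(int(np.argmax(logits)))
    return np.array(labels)

# Toy sequence standing in for recorded robot-arm kinematics.
sequence = rng.normal(size=(50, N_FEATURES))
phases = recognize_phases(sequence)
print(phases.shape)  # (50,)
```

A video-based variant would replace the raw kinematic vector at each timestep with frame features from a convolutional backbone, and a transformer would replace the recurrence with self-attention over the whole sequence.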
The candidate must have knowledge of deep learning, data analysis, computer science, and programming (Python).
Duration: 5 to 6 months
Salary or allowance: Standard internship allowances
 A. Huaulmé, F. Despinoy, S. A. H. Perez, K. Harada, M. Mitsuishi, and P. Jannin, “Automatic annotation of surgical activities using virtual reality environments”, Int. J. Comput. Assist. Radiol. Surg., June 2019.
 A. Huaulmé, P. Jannin, et al., “PEg TRAnsfer Workflow recognition challenge report: Does multi-modal data improve recognition?”, Computer Methods and Programs in Biomedicine, 2023.
 A. Huaulmé, S. Voros, L. Riffaud, G. Forestier, A. Moreau-Gaudry, and P. Jannin, “Distinguishing surgical behavior by sequential pattern discovery,” J. Biomed. Inform., 2017.
 A. Huaulmé, K. Nyangoh-Timoh, V. Jan, S. Guerin, and P. Jannin, “Global Versus Local Kinematic Skills Assessment on Robotic Assisted Hysterectomies”, Hamlyn Symposium, 2023.
 A. Huaulmé, K. Harada, G. Forestier, M. Mitsuishi, and P. Jannin, “Sequential surgical signatures in micro-suturing task,” Int. J. Comput. Assist. Radiol. Surg., vol. 13, no. 9, pp. 1419–1428, May 2018.
How to apply: send an email to Arnaud Huaulmé: email@example.com
Application deadline: 30 April 2024
Offer published on 5 October 2023, displayed until 30 April 2024