
Irene Rivas Blanco, Universidad de Málaga, Spain
Eva Góngora Rodríguez, Universidad de Málaga, Spain
Carmen López-Casado, Universidad de Málaga, Spain
Manuel Caballero Roldán, Universidad de Málaga, Spain
No. 45 (2024), Bioengineering
DOI: https://doi.org/10.17979/ja-cea.2024.45.10924
Received: June 5, 2024. Accepted: July 8, 2024. Published: July 12, 2024

Abstract

The automation of surgical tasks is a rapidly growing field of research. In recent decades, the integration of robotics and artificial intelligence into surgical environments has shown great potential to improve the precision, efficiency, and safety of surgical procedures. The ability of these systems to perform repetitive tasks with high precision and without fatigue, combined with their capacity to process and analyze large volumes of data in real time, offers unprecedented opportunities to transform surgical practice.
This work takes a further step in the field of autonomous surgical robots by developing a robotic assistant for the automatic aspiration of bleeding during laparoscopic interventions. To this end, a bleeding-detection algorithm based on a convolutional neural network has been developed. In addition, a conventional surgical aspirator has been automated so that it can perform its function autonomously while attached to the end effector of a robotic arm.
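To illustrate the general idea of CNN-based bleeding detection described in the abstract, the following minimal PyTorch sketch scores a laparoscopic frame for the presence of bleeding, which could then trigger an aspiration routine. The architecture, input resolution, and decision threshold shown here are illustrative assumptions and do not correspond to the network or pipeline reported in the paper.

# Illustrative sketch (not the authors' implementation): a small CNN that
# classifies a laparoscopic frame as "bleeding" / "no bleeding".
import torch
import torch.nn as nn

class BleedingDetector(nn.Module):
    """Minimal convolutional classifier returning a bleeding probability."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                      # 224 -> 112
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                      # 112 -> 56
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),              # global average pooling
        )
        self.classifier = nn.Linear(64, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = self.features(x).flatten(1)
        return torch.sigmoid(self.classifier(h))  # probability in [0, 1]

if __name__ == "__main__":
    model = BleedingDetector().eval()
    frame = torch.rand(1, 3, 224, 224)            # stand-in for a laparoscopic frame
    with torch.no_grad():
        p_bleed = model(frame).item()
    # Hypothetical threshold: in a full system this decision would trigger
    # the automated aspirator mounted on the robot's end effector.
    if p_bleed > 0.5:
        print(f"Bleeding detected (p={p_bleed:.2f}): trigger aspiration")
    else:
        print(f"No bleeding detected (p={p_bleed:.2f})")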
