Abstract
Tactile feedback in robot-assisted minimally invasive surgery (RAMIS) is crucial for surgeons when palpating subsurface tumors and other organ structures. The research presented here proposes a new approach to tactile sensation generation that aims to provide deformation and texture detection in RAMIS. The proposed solution comprises three phases: feature extraction, recognition, and feedback. The feature extraction phase is based on data acquisition from two micro-electromechanical systems (MEMS) sensors and a force-sensitive resistor (FSR) sensor attached to an EndoWrist thoracic grasper instrument compatible with the da Vinci Surgical System. The acquired data is processed using digital signal processing methods and utilized in the recognition phase, where the extracted features serve as inputs for training and testing two advanced machine learning algorithms: a Reflex Fuzzy Min-Max Neural Network (RFMN) and a Time Series Classification - Learning Shapelets (TSC-LS) method. These algorithms aim to accurately recognize physiological structures of differing softness and roughness and classify them into corresponding deformation or texture labels. Lastly, the labeled data is delivered to the surgeon as mechanical feedback via a visual-tactile display and a wearable device on the surgeon's forearm, mimicking palpation feedback during RAMIS.
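To illustrate the shapelet-based recognition idea mentioned in the abstract: a shapelet classifier assigns a label by finding which characteristic subsequence (shapelet) lies closest, in Euclidean distance, to any window of the incoming sensor signal. The sketch below is a minimal, hypothetical illustration of that principle only; the actual TSC-LS method learns its shapelets from training data rather than using the hand-picked patterns and labels shown here.

```python
import math

def shapelet_distance(series, shapelet):
    """Minimum Euclidean distance between a shapelet and all
    equal-length sliding windows of the series."""
    m = len(shapelet)
    best = math.inf
    for start in range(len(series) - m + 1):
        d = math.sqrt(sum((series[start + i] - shapelet[i]) ** 2
                          for i in range(m)))
        best = min(best, d)
    return best

# Hypothetical shapelets, one per tissue class (illustrative values,
# not from the paper): a gentle bump vs. a sharp force spike.
shapelets = {
    "soft":  [0.0, 0.1, 0.3, 0.1, 0.0],
    "stiff": [0.0, 0.5, 1.0, 0.5, 0.0],
}

def classify(series):
    """Assign the label whose shapelet lies closest to the series."""
    return min(shapelets, key=lambda lbl: shapelet_distance(series, shapelets[lbl]))

signal = [0.0, 0.05, 0.45, 0.95, 0.55, 0.05, 0.0]
print(classify(signal))  # prints "stiff": the spike pattern matches best
```

In the full method, the shapelet shapes themselves are optimized during training so that these minimum distances best separate the deformation and texture classes.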
| Original language | English (US) |
|---|---|
| Journal | IEEE Transactions on Biomedical Engineering |
| State | Accepted/In press - 2025 |
Keywords
- machine learning
- object classification
- robotic-assisted surgery
- tactile feedback
ASJC Scopus subject areas
- Biomedical Engineering