77 patents in this list

Updated: July 18, 2024

High precision and efficiency in surgical procedures are essential for reducing patient risk and improving recovery outcomes. Traditional surgical techniques, while effective, are increasingly complemented by advanced technologies to enhance performance and accuracy.

This article examines the role of AI-driven robots in surgical assistance, an innovative approach that is transforming the surgical landscape.

With advancements in AI and robotics, we can now achieve unprecedented levels of precision, control, and efficiency in surgical procedures. These innovations are revolutionizing surgery by providing enhanced assistance to surgeons, reducing risks, and improving patient outcomes in the operating room.

1. Neural Network-Based Automatic Tool Disengagement in Surgical Robots

Verb Surgical Inc., 2024

Automatic disengagement of surgical tools from robotic arms when the user returns the hand controllers to their docks, without needing a foot pedal. A recurrent neural network classifier determines whether the controller motion corresponds to docking or to teleoperation. When docking is detected, the robot disengages the tools to prevent unintended movements. The classifier is trained on labeled time-series data of controller motions.
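
To make the classifier concrete, here is a minimal sketch of the kind of recurrent model the entry describes, assuming a GRU over fixed-length windows of controller pose samples; the feature layout, window length, and decision threshold are illustrative and not taken from the patent.

```python
# Minimal sketch (assumed architecture): a GRU classifier that labels a window
# of hand-controller motion samples as "docking" or "teleoperation".
import torch
import torch.nn as nn

class DockingClassifier(nn.Module):
    def __init__(self, n_features=7, hidden=64):
        super().__init__()
        self.gru = nn.GRU(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 2)   # index 0 = teleoperation, 1 = docking

    def forward(self, x):                  # x: (batch, time, n_features)
        _, h = self.gru(x)                 # final hidden state: (1, batch, hidden)
        return self.head(h[-1])            # logits per window

model = DockingClassifier()
window = torch.randn(1, 50, 7)             # 50 samples of pose + grip (hypothetical layout)
if model(window).softmax(-1)[0, 1] > 0.9:  # untrained here; threshold is an assumption
    print("docking motion detected -> disengage tools")
```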

2. Machine Learning System for Analyzing Surgical Videos and Enhancing Surgical Assistance

Theator Inc., 2024

Using machine learning to analyze surgical videos and derive insights such as identifying deviations from surgical planes, assessing surgical competency, and generating statistical reports linked to video evidence. The system receives video frames from multiple surgeries, categorizes them by event, aggregates statistics within each category, and presents representative frames from those categories. It also predicts upcoming events in ongoing surgeries, suggests video reviews, and assigns surgeons to surgeries based on skill level and time estimates.
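
A minimal sketch of the categorize-and-aggregate step, assuming per-frame event labels have already been produced by a trained video model; the event names and record layout are hypothetical.

```python
from collections import defaultdict

# Hypothetical per-frame outputs of a trained surgical-video model.
frames = [
    {"surgery_id": "A", "timestamp": 512.4, "event": "bleeding"},
    {"surgery_id": "A", "timestamp": 530.1, "event": "bleeding"},
    {"surgery_id": "B", "timestamp": 118.9, "event": "plane_deviation"},
]

# Categorize frames by event and aggregate statistics within each category,
# keeping links back to the underlying video evidence.
stats = defaultdict(list)
for f in frames:
    stats[f["event"]].append((f["surgery_id"], f["timestamp"]))

for event, occurrences in stats.items():
    surgery, ts = occurrences[0]
    print(f"{event}: {len(occurrences)} occurrences, e.g. surgery {surgery} at t={ts}s")
```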

3. Machine Learning Models for Real-Time Surgical State and Instrument Detection

DIGITAL SURGERY LIMITED, 2024

Automatically detecting surgical states, instrument locations, and motion profiles from video during surgery using machine learning. The method involves training machine learning models to predict surgical states and detect instruments in video frames based on input data. It uses shared features between state and instrument detection to improve both tasks. The models can also leverage temporal information from state prediction to boost instrument detection. This allows real-time augmentation of surgical video with instrument overlays and motion profiles.
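
The shared-feature idea can be sketched as a multi-task network with one backbone and two heads, one per task; the backbone, input size, and class counts below are assumptions for illustration, not the patented model.

```python
import torch
import torch.nn as nn

class SharedBackboneModel(nn.Module):
    """Shared CNN features feed both a surgical-state head and an instrument head."""
    def __init__(self, n_states=10, n_instruments=8):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.state_head = nn.Linear(64, n_states)             # which surgical state
        self.instrument_head = nn.Linear(64, n_instruments)   # which instruments are present

    def forward(self, frame):
        feats = self.backbone(frame)           # features shared by both tasks
        return self.state_head(feats), self.instrument_head(feats)

frames = torch.randn(2, 3, 224, 224)           # a small batch of video frames
state_logits, instrument_logits = SharedBackboneModel()(frames)
```

Training both heads against the shared backbone is what lets improvements in one task feed the other.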

4. AI System for Real-Time Surgical Guidance and Risk Analysis

Orthogrid Systems Holdings, LLC, 2024

Artificial intelligence (AI) system for real-time surgical guidance during procedures like joint replacements, fracture reductions, and spine surgeries. The system analyzes intraoperative images to calculate surgical decision risks and provides automated guidance to optimize implant placement, fracture reduction, and alignment. It uses AI models trained on surgical data to identify anatomical landmarks, register images, and predict risks. The system displays visual guidance to surgeons, dynamically updating as the procedure progresses.

Patent drawing: US20240206990A1

5. AI-Enhanced Tissue Recognition for Autonomous Robotic Surgery Actions

Intuitive Surgical Operations, Inc., 2024

Using AI to enhance robotic surgery by leveraging image recognition to identify tissue types during surgery and allowing the robot to autonomously perform actions based on the identified tissue. The system involves obtaining images of the surgical site, identifying the tissue types using AI, and if the identified tissue matches a targeted type, having the robot remove it. This allows the robot to perform specific actions without manual guidance once the AI tissue recognition is confident. The AI can also help with tasks like incision placement, guide wire manipulation, and imaging output projection to improve accuracy and repeatability.
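
A minimal sketch of the confidence-gated autonomy described above, assuming the recognizer returns per-region tissue probabilities; the target label and threshold are hypothetical.

```python
TARGET_TISSUE = "lesion"          # tissue type the robot is permitted to remove (assumed)
CONFIDENCE_THRESHOLD = 0.95       # act autonomously only above this confidence (assumed)

def decide_action(tissue_probs: dict[str, float]) -> str:
    """Return the action the robot may take for one identified tissue region."""
    label = max(tissue_probs, key=tissue_probs.get)
    if label == TARGET_TISSUE and tissue_probs[label] >= CONFIDENCE_THRESHOLD:
        return "remove"            # recognition is confident: autonomous action permitted
    return "defer_to_surgeon"      # otherwise wait for manual guidance

print(decide_action({"lesion": 0.97, "healthy": 0.03}))   # -> remove
print(decide_action({"lesion": 0.60, "healthy": 0.40}))   # -> defer_to_surgeon
```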

6. Ultrasound-Based Tracking for Robotic Surgical Instruments with Redundant Tracking Modalities

GLOBUS MEDICAL, INC., 2024

Tracking the position of a robotic surgical instrument using ultrasound (US) instead of optical markers. A US transducer is attached to the instrument and generates US images of the anatomy, and the instrument's pose is determined by matching anatomical features in the US images to a template of the instrument. If the US transducer lifts off the patient, the system switches to optical tracking; if the transducer stops outputting images, it falls back to kinematics. This redundancy provides continuous, accurate tracking throughout instrument motion.
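
A sketch of the fallback logic between the three tracking modalities, assuming contact and streaming status are exposed as simple flags; the patent does not specify this exact interface.

```python
from enum import Enum, auto

class Modality(Enum):
    ULTRASOUND = auto()
    OPTICAL = auto()
    KINEMATICS = auto()

def select_pose_source(us_streaming: bool, us_in_contact: bool) -> Modality:
    """Pick a tracking modality using the fallback order described above (assumed logic)."""
    if us_streaming and us_in_contact:
        return Modality.ULTRASOUND   # primary: match anatomy in US images to a template
    if us_streaming:
        return Modality.OPTICAL      # transducer lifted off the patient
    return Modality.KINEMATICS       # transducer stopped outputting images

print(select_pose_source(us_streaming=True, us_in_contact=False))   # -> Modality.OPTICAL
```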

7. Machine Learning-Enhanced Robotic Surgical System for Improved Suturing Precision

Covidien LP, 2024

Robotic surgical system with enhanced tissue suturing guidance using visual feedback and machine learning. An imaging device captures the position and orientation of the surgical needle inside the body during suturing, and the system estimates the needle tip location and trajectory from the images. This data is used to refine the robot's control signals to improve needle placement and avoid collisions. The system can also augment the video feed to highlight the needle and show its path, and can check for tissue contact to guide the robot's motion.
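
As a rough illustration of the trajectory-estimation step, the sketch below fits a straight line to recent needle-tip detections and extrapolates it forward; a real system would use a richer motion model, so treat this as an assumed simplification.

```python
import numpy as np

def extrapolate_tip(tip_positions: np.ndarray, horizon: int = 5) -> np.ndarray:
    """Fit a per-axis linear trajectory to recent needle-tip detections (N x 3, image-derived)
    and extrapolate the next `horizon` positions."""
    t = np.arange(len(tip_positions))
    slope, intercept = np.polyfit(t, tip_positions, deg=1)        # one fit per axis
    t_future = np.arange(len(tip_positions), len(tip_positions) + horizon)
    return t_future[:, None] * slope + intercept

recent_tips = np.array([[0.0, 0.0, 0.0],
                        [1.0, 0.5, 0.1],
                        [2.0, 1.0, 0.2]])    # hypothetical tip detections in mm
print(extrapolate_tip(recent_tips, horizon=2))
```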

8. Hierarchical Multi-Modal Position Tracking System for Surgical Applications

Dignity Health, 2024

Multi-modal position tracking system for accurately tracking objects over extended time periods in applications such as surgery. The system uses hierarchical tracking sub-systems (parent, child, grandchild, and so on) that record object positions relative to their own virtual spaces. A mapping between the virtual spaces allows positions to be translated across the hierarchy. Observing an object from multiple sub-systems provides multi-modal position estimation and verification, with corrections applied when the estimates disagree. By compensating for drift between sub-systems, the system maintains accurate tracking over extended periods in surgical applications.

Patent drawing: US20240164851A1
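
The parent/child/grandchild mapping amounts to composing rigid transforms between the sub-systems' virtual spaces. A minimal sketch with homogeneous matrices follows; the specific transforms are made-up values for illustration.

```python
import numpy as np

def make_transform(R: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Build a 4x4 homogeneous transform from rotation R (3x3) and translation t (3,)."""
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    return T

# Assumed example mappings between virtual spaces in the hierarchy.
T_parent_from_child = make_transform(np.eye(3), np.array([0.0, 0.0, 1.0]))
T_child_from_grandchild = make_transform(np.eye(3), np.array([0.1, 0.0, 0.0]))

# A point recorded by the grandchild sub-system, in homogeneous coordinates.
point_in_grandchild = np.array([0.0, 0.05, 0.0, 1.0])
point_in_parent = T_parent_from_child @ T_child_from_grandchild @ point_in_grandchild
print(point_in_parent[:3])   # the same point expressed in the parent's virtual space
```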

9. Machine Learning and Computer Vision for Real-Time Surgical Assistance

DIGITAL SURGERY LIMITED, 2024

Using machine learning and computer vision to automatically detect critical anatomical structures and surgical instruments during live video of a surgical procedure. The technique involves training machine learning models to predict the presence and location of instruments and structures in the video frame using spatio-temporal information. The models can also predict the overall surgical state. The instrument and structure detection models share extracted features with the state prediction models to improve confidence. This allows real-time augmented visualization of the surgical view. The technique improves surgical safety and workflow by providing action guidance and alerts.

Patent drawing: US20240161497A1

10. Intraoperative Video Analysis for AI-Driven Surgical Flow Deviation Recommendations

ORTHOSOFT ULC, 2024

Computer-assisted surgery system that can recommend deviations from standard surgical flows based on intraoperative video analysis. The system uses a processing unit and memory to monitor a surgical procedure via video feed and detect conditions requiring deviations outside the standard flow. It also trains a machine learning model on paired video and tracking data so that tracking can later be performed from video alone, without a separate tracking device. This allows the system to provide intraoperative recommendations for deviations based on video analysis alone.

Patent drawing: US20240148439A1

11. AI-Powered Augmented Reality Guidance for Real-Time Surgical Assistance

Activ Surgical, Inc., 2024

Real-time augmented surgical guidance system using AI to provide predictive visualizations of critical structure locations and tissue viability during surgery. Machine learning models trained on whole-surgery datasets with anatomical and physiological data anticipate critical structures and assess tissue viability from imaging and physiological inputs. The AI analyzes imaging data from multiple modalities, extracting features to classify structures and assess viability, and generates enhanced views with guidance, metrics, and decision support for the surgeon during the procedure. It can also autonomously track deformable tissues during maneuvers.

12. Computer Vision-Controlled Surgical Tools for Enhanced Operation Safety

DIGITAL SURGERY LIMITED, 2024

Using computer vision to improve safety and reliability of surgical procedures by controlling the operation of surgical tools based on recognized objects in the video feed. The system trains a computer vision model to recognize surgical tools from live video. During surgeries, the model is used to detect surgical tools in the video feed. When a tool is detected, it enables control of the tool to perform its function. This ensures tools are only used when visible, preventing accidental activation outside the field of view.

Patent drawing: US20240122734A1
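
The gating rule itself is simple once the vision model's detections are available; below is a sketch under the assumption that detections arrive as a list of recognized tool labels, which is not something the patent specifies.

```python
def tool_may_activate(detections: list[str], tool_name: str, activation_requested: bool) -> bool:
    """Allow activation only while the requested tool is visible in the current frame."""
    return activation_requested and tool_name in detections

print(tool_may_activate(["grasper", "monopolar_shears"], "monopolar_shears", True))  # True
print(tool_may_activate(["grasper"], "monopolar_shears", True))                      # False: out of view
```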

13. AI and Simulation-Based Optimization of Minimally Invasive Surgical Procedures

HUTOM CO., LTD., 2024

Optimizing surgery by using simulation and AI to find the best tools, procedures, and entry points for minimally invasive surgeries. The method involves generating genes representing surgical procedures, simulating them in virtual bodies, evaluating optimality, and applying genetic algorithms to derive improved procedures and instrument configurations. This provides optimized surgical cue sheets and robot designs that improve efficiency and convenience in actual surgery.

Patent drawing: US11957415B2
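
A bare-bones genetic-algorithm loop of the kind the entry describes, with a placeholder fitness function standing in for the virtual-body simulation; the gene encoding, population size, and operators are all assumptions.

```python
import random

def fitness(gene):
    """Placeholder score; in the patent this would come from simulating the procedure
    encoded by the gene in a virtual body and evaluating its optimality."""
    return -sum((g - 0.5) ** 2 for g in gene)

def genetic_optimize(gene_len=6, pop_size=30, generations=50, mutation_sd=0.1):
    population = [[random.random() for _ in range(gene_len)] for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        parents = population[: pop_size // 2]                         # selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, gene_len)
            child = a[:cut] + b[cut:]                                 # crossover
            child = [g + random.gauss(0, mutation_sd) for g in child] # mutation
            children.append(child)
        population = parents + children
    return max(population, key=fitness)

print(genetic_optimize())   # the best procedure encoding found
```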

14. Real-Time Surgical Assistance through Graph Neural Network Analysis of Video Data

THE GENERAL HOSPITAL CORPORATION, MASSACHUSETTS INSTITUTE OF TECHNOLOGY, 2024

Using graph neural networks to interpret surgical video data and provide real-time feedback to surgeons during operations. The method involves extracting numerical features from sensor data like video to represent surgical concepts like anatomy, tools, and safety. These features are passed through graph neural networks with connections representing relationships between concepts. The networks learn to reason about the concepts during surgery and provide statistical parameters representing the state of the procedure. This information can be displayed to surgeons to assist decision making and mitigate risks.
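
A minimal message-passing layer over a toy "surgical concept" graph, to show the flavor of the approach; the node set, adjacency, and readout below are illustrative rather than the networks described in the patent.

```python
import torch
import torch.nn as nn

class ConceptGraphLayer(nn.Module):
    """One round of message passing over a graph of surgical concepts
    (anatomy, tools, safety criteria); edges encode their relationships."""
    def __init__(self, dim=32):
        super().__init__()
        self.message = nn.Linear(dim, dim)
        self.update = nn.Linear(2 * dim, dim)

    def forward(self, node_feats, adjacency):
        # node_feats: (n_nodes, dim); adjacency: (n_nodes, n_nodes), 1.0 where an edge exists
        msgs = adjacency @ self.message(node_feats)        # aggregate neighbor messages
        return torch.relu(self.update(torch.cat([node_feats, msgs], dim=-1)))

nodes = torch.randn(5, 32)                 # e.g. features for gallbladder, grasper, clip, ...
adj = torch.ones(5, 5) - torch.eye(5)      # fully connected toy graph
state = ConceptGraphLayer()(nodes, adj)    # updated concept embeddings
risk_score = torch.sigmoid(state.mean())   # toy readout of one procedure-state parameter
print(risk_score.item())
```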

15. AI-Generated Surgical Plans for Improved Medical Procedure Efficiency and Safety

Shanghai United Imaging Intelligence Co., Ltd., 2024

Automatically generating surgical plans in a medical environment using AI to relieve the burden on medical professionals and enhance safety, efficiency, and effectiveness of surgeries. The system uses image analysis to create patient and environment models from captured images. It then devises a surgical plan based on these models, including movement paths for medical devices, to execute procedures. The AI identifies people and objects in images, determines patient anatomy and environment layout, and generates optimized surgical plans.

Patent drawing: US20240099774A1

16. Machine Learning-Based Haptic Feedback Generation for Robotic Surgery Assistance

Verb Surgical Inc., 2024

Generating haptic feedback for robotic surgery to provide tactile sensations to the surgeon during procedures. The feedback is based on analyzing video of the surgery with machine learning models trained to correlate the visual appearance of tool-tissue interactions with force levels. The models predict the force being applied in the video, and this estimate is converted into haptic feedback sent to the surgeon's user interface. This aims to replicate, through the robotic instruments, the tactile feedback a surgeon would feel during open surgery.

Patent drawing: US20240099555A1
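
A sketch of the video-to-force-to-haptics chain, assuming a small 3D-CNN force regressor and a simple normalization of the predicted force into a haptic command; both the architecture and the scaling are hypothetical.

```python
import torch
import torch.nn as nn

class ForceFromVideo(nn.Module):
    """Regress an applied-force estimate from a short clip of tool-tissue interaction."""
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(nn.Conv3d(3, 16, 3, padding=1), nn.ReLU(),
                                     nn.AdaptiveAvgPool3d(1), nn.Flatten())
        self.regressor = nn.Linear(16, 1)

    def forward(self, clip):                 # clip: (batch, 3, frames, H, W)
        return self.regressor(self.encoder(clip))

def force_to_haptic(force_n: float, max_force_n: float = 10.0) -> float:
    """Map a predicted force (newtons) to a normalized haptic command in [0, 1]."""
    return min(max(force_n / max_force_n, 0.0), 1.0)

clip = torch.randn(1, 3, 8, 64, 64)               # a short hypothetical video clip
predicted_force = ForceFromVideo()(clip).item()   # untrained; illustrates the data flow
print(force_to_haptic(predicted_force))
```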

17. Augmented Reality Guidance System for Enhanced Surgical Accuracy and Workflow

Smith & Nephew, Inc., 2024

Enhancing surgical workflows with a computer-assisted surgical system that provides augmented reality guidance during surgery. The system combines traditional surgical navigation with augmented reality to improve accuracy and reduce distractions. Specialized markers can be tracked by both the surgical navigation system and the augmented reality system, allowing the two tracking frames to be co-registered so that augmented reality data integrates seamlessly into the surgeon's view; sharing the same markers also simplifies setup and avoids conflicts between the tracking systems. The augmented reality display overlays surgical plan information directly on the patient anatomy, and can project a virtual screen in the surgeon's line of sight to display the GUI, so the surgeon never has to look away from the surgical site.
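
Co-registering the two tracking frames from shared markers is essentially a rigid point-set alignment. The sketch below uses the Kabsch algorithm on corresponding marker positions, a standard approach offered here only as an assumption about how such a transform might be computed.

```python
import numpy as np

def register_frames(markers_nav: np.ndarray, markers_ar: np.ndarray):
    """Rigid transform (R, t) mapping navigation-frame marker positions (N x 3)
    to the same markers as seen by the AR tracking frame (N x 3), via Kabsch."""
    c_nav, c_ar = markers_nav.mean(0), markers_ar.mean(0)
    H = (markers_nav - c_nav).T @ (markers_ar - c_ar)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))          # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_ar - R @ c_nav
    return R, t

nav = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], dtype=float)
R_true = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]], dtype=float)   # synthetic ground truth
ar = nav @ R_true.T + np.array([0.2, 0.1, 0.0])
R, t = register_frames(nav, ar)
print(np.round(R @ nav[1] + t - ar[1], 6))   # ~0: the frames are co-registered
```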

18. AI-Enhanced Robotic Assistance for Precision in Joint Replacement Surgery

Smith & Nephew, Inc., 2024

Computer-assisted surgical system that enhances surgical workflows by providing advanced tools and techniques to improve total joint replacement procedures. The system uses a combination of imaging, robotics, and data analysis to enable more accurate, efficient, and customized joint replacements. Features include:

1. Robotic arm with electromagnetic tracking for guided bone preparation and implantation.
2. Point probe device for high-resolution imaging of critical areas during hip surgeries.
3. Multi-sensor navigation system for accurate tracking of bone fixation devices.
4. Registration of pre-operative data to patient anatomy using the point probe.
5. 3D modeling of patient anatomy from bi-planar images.
6. Automated optimization of surgical parameters using historical data and patient goals.

19. AI and Patterned Light Beam Technology for Markerless Surgical Tracking and Guidance

Future Health Works Ltd., 2024

High-precision, minimally invasive tracking and guidance of an object, such as a body part, during surgery without large markers. A patterned light beam is projected onto the object's contour, and AI-based image processing and machine learning register, track, and guide the object without invasive marker placement. The detected light pattern is segmented, quantized, reconstructed, transformed, concatenated, normalized, and analyzed by the ML model to determine the object's pose.

Patent drawing: US20240081917A1
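
A rough sketch of the segment-quantize-normalize-regress pipeline, assuming the beam contour has already been extracted as pixel coordinates and that a small MLP regresses a 6-DoF pose; the feature length, normalization, and network are all assumptions made for illustration.

```python
import numpy as np
import torch
import torch.nn as nn

def normalize_contour(points: np.ndarray, n_samples: int = 64) -> np.ndarray:
    """Resample the detected light-pattern contour (N x 2 pixel coordinates) to a fixed
    length and normalize away translation and scale, yielding a flat feature vector."""
    idx = np.linspace(0, len(points) - 1, n_samples).astype(int)   # quantize to fixed length
    pts = points[idx].astype(float)
    pts -= pts.mean(0)                                             # remove translation
    pts /= (np.abs(pts).max() + 1e-9)                              # remove scale
    return pts.reshape(-1)

pose_regressor = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 6))  # 6-DoF output

contour = np.cumsum(np.random.rand(500, 2), axis=0)   # stand-in for a segmented beam contour
features = torch.tensor(normalize_contour(contour), dtype=torch.float32)
pose = pose_regressor(features)                        # x, y, z, roll, pitch, yaw (untrained)
print(pose.detach().numpy())
```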

20. Robotic-Assisted Minimally Invasive Customization of Surgical Implants

IX Innovation LLC, 2024

Automated customization of surgical implants to enable less invasive installation using robotic surgery. The method involves using imaging data to identify less invasive installation paths and dimensions for customized implant components. The implant is virtually segmented into components that can be passed through the identified routes. The components are simulated moving and assembled at the implant site. Modifications are made if components can't pass or assemble. This allows minimally invasive robotic installation of customized implants through less damaging paths.

Patent drawing: US11918296B1

21. Autonomous Navigation System for Steerable Surgical Devices

22. AI-Powered Intraoperative Guidance System for Improved Surgical Outcomes

23. Machine Learning-Assisted Surgical Planning for Orthopedic Procedures

24. AI-Optimized Path Planning and Implant Design for Robotic Surgery Assistance

25. Neural Network-Based Motion Error Detection in Surgical Robots

Request the full report with complete details of these +57 patents for offline reading.