
Douglas Teany Presents Method AI at LSI USA '24

Method AI is developing a surgical navigation platform to drive improved patient outcomes for robotic oncology procedures.
Speakers
Doug Teany
Method AI

Doug Teany  0:03  
Thank you. My name is Doug Teany. I'm co-founder and CEO of Method AI. At Method AI, we're on a mission to make surgeons more effective. We believe that mission is based in robotics but dependent on better intraoperative imaging. So today I'm going to introduce an imaging platform that we've developed at Method and talk about how we're leveraging it to build a live 3D subsurface surgical map to guide physicians to better outcomes.

We're starting where we can have a meaningful patient outcome gain: surgical oncology. Today, roughly one out of five prostate procedures results in cancer being left behind. The failure rate for ovarian procedures is almost double that, and for organs like the kidneys, more than half of procedures result in total organ loss. So we think there's a lot of room for improvement here.

Here's the problem: physicians cannot see what they're cutting out today, despite the brilliant endoscope view they get with today's robotic systems. Tumors can be endophytic and lie completely below the visible surface of the organ. They can push five to seven centimeters deep into the organ, they have irregular geometries, and they can come dangerously close to critical features like arteries and nerves, which can create complications during the procedure or affect long-term organ function if something goes wrong during the case.

To address those challenges, we think the robotic console view should look something like this. You should preserve that brilliant endoscope view, and Method is developing an image-guided surgical navigation system that provides a split-screen view showing the physician what lies below the surface. We'll accurately segment the tumor, we'll wrap a recommended surgical plan around that tumor to guide the physician through an effective resection, and we'll highlight critical features, in this case arteries, that either need to be avoided or dealt with as part of the procedure to get the best possible outcome.

If you haven't leaped there in your head already, the technology behind the map is ultrasound. We've developed a new and novel 3D ultrasound platform for minimally invasive surgery. It passes through a surgical trocar and creates a high-resolution, large field of view that can image full tumors up to seven centimeters. Perhaps its most competitive feature is that it can run for extended periods of time without heating up, which today's 3D transducers can't do, and that allows us to do something quite unique with it. The picture-in-picture is a video demonstrating how we attach this ultrasound transducer to the target surgical organ with a vacuum boot disposable to create a continuous imaging scenario. Once you have continuous, high-resolution, uninterrupted ultrasound in surgery, you open up all kinds of possibilities for things that surgeons cannot see today.

The video on the bottom shows the image quality we have achieved with this device, which is presently better than CT quality. It also illustrates the accuracy of our segmentation and tracking algorithms. That is a roughly two-and-a-half-centimeter tumor, completely endophytic in a kidney model, so below the visible surface of the organ, and we are manipulating that organ as a surgeon would manipulate an organ in surgery and holding that tumor the whole time. This is the makings of what would be an industry first: a live, deep-tissue surgical map for soft tissue procedures. No CT overlay required, no registration with a preoperative image.
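As an illustration only of the pipeline described above, a rough conceptual sketch might look like the following; every class, method, and function name here is a hypothetical placeholder, not Method AI's actual software or API.

```python
# Conceptual sketch of a live subsurface mapping loop, assuming an organ-attached
# 3D transducer that streams volumes continuously. All interfaces are hypothetical.

from dataclasses import dataclass
import numpy as np


@dataclass
class SurgicalMap:
    tumor_mask: np.ndarray        # voxel segmentation of the tumor
    margin_surface: np.ndarray    # recommended resection margin wrapped around the tumor
    critical_structures: dict     # e.g. {"artery": mask, "nerve": mask}


def build_live_map(transducer, segmenter, tracker, planner, console):
    """Continuously update a deep-tissue map from an attached 3D ultrasound transducer."""
    reference_volume = transducer.acquire_volume()            # first 3D ultrasound frame
    tumor = segmenter.segment(reference_volume)               # initial tumor segmentation

    while transducer.is_attached():
        volume = transducer.acquire_volume()                  # continuous, uninterrupted imaging
        tumor = tracker.update(volume, tumor)                 # follow the tumor as the organ is manipulated
        structures = segmenter.find_critical(volume)          # arteries, nerves, etc.
        margin = planner.recommend_margin(tumor, structures)  # surgical plan around the tumor

        live_map = SurgicalMap(tumor, margin, structures)
        console.render_split_screen(                          # endoscope view alongside the subsurface map
            endoscope=console.endoscope_feed(),
            subsurface=live_map,
        )
```

The key design point the talk emphasizes is the continuous attachment: because the transducer stays fixed to the organ and streams without interruption, the map can be tracked frame to frame rather than registered to a preoperative CT.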
It is generated from the live signal from our attached transducer and will maintain pace with the procedure in real time, up to and including excision of the tumor.

This is what the product looks like. Like any ultrasound system, it's a cart-based system. Most of the magic of our system happens at the patient, where the surgical transducer is introduced and attached to the target organ. To maximize our commercial opportunity, we are intentionally remaining robotic-system agnostic and co-locating a touchscreen next to the robotic console for the physician to interact with Method's system. We will then port that 4D deep-tissue surgical map into the robotic console view through the standard third-party integration capabilities available in most robotic systems.

A little bit about why we think this is different from competitive offerings: a new imaging device, attachment with a vacuum boot, a new class of ultrasound algorithms. But what we're really attempting to do is bridge the gap between enhanced visualization and surgical execution. What most surgeons will tell you is, "I need to see more; I need to see the margins of the tumor in real time, and that will improve my outcomes." What we've attempted to do is bridge that into what they actually do with their hands. So we're pushing deep with ultrasound rather than giving an enhanced view of what they already see in the endoscope. We're providing a full surgical plan rather than incremental information through the endoscope. And we're doing real-time, proactive surgical guidance to avoid those complications and achieve full cancer removal, compared to the standard of care, which is cut-and-check.

In the end, we believe high-resolution ultrasound is the key to unlock automation possibilities for soft tissue robotics. And because this imaging platform doesn't exist today and will be used in clinical cases, we'll create a dataset that we think has perhaps more long-term value than the product itself for future AI indications. Simply put, this data doesn't exist in the world: live 3D surgical imagery that can help inform future cases and improve the patient care continuum.

In terms of value creation, we are focused first and foremost on the patient. We believe we can drive upwards of a 70% improvement in cancer-free survival. This is based on the fact that the top 1% of surgeons have outcomes 70% better than the rest of the world. We believe that if you provide a segmented surgical target and a recommended best-outcome surgical plan, you can enable physicians to achieve the outcomes of the top 1%.

There's a nice incremental revenue opportunity for providers that buy into this system; we project three to four billion dollars in the US alone. This is not based on a new DRG code, but on sweeping surgically eligible patients back into the OR who don't get surgery today because physicians pass on their cases, because they're too complicated. We don't assume we'll get them all back; we assume only 20% of those cases come back into the surgical suite, because for the first time a surgeon can see the complexities of the case below the surface of the organ and be guided through that procedure. There's a nice revenue opportunity here for Method, upwards of $1 billion, through a hybrid capital plus software-as-a-service model. We're really leaning in on the software portion of the pro forma; happy to share that with anyone after. And this is all about robotic adoption.
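As a rough sanity check on that provider projection (not a calculation from the talk), the 20% recapture assumption can be combined with the roughly $13 billion / one million procedure figures quoted a moment later to see how many passed-on cases the $3–4 billion number would imply. The per-case reimbursement and the derived case count below are assumptions and implications, not stated figures.

```python
# Back-of-envelope check of the provider revenue projection.
# Inputs: ~$13B reimbursement across ~1M annual procedures (figures quoted in the
# talk) and a 20% recapture rate of cases surgeons currently pass on. The implied
# count of passed-on cases is derived here, not stated in the talk.

reimbursement_market_usd = 13e9   # ~$13B US reimbursement market
annual_procedures = 1e6           # ~1M procedures per year
recapture_rate = 0.20             # talk assumes 20% of passed-on cases return to the OR

avg_reimbursement = reimbursement_market_usd / annual_procedures  # ≈ $13,000 per case

for projected_revenue in (3e9, 4e9):  # the $3–4B provider projection
    implied_passed_on_cases = projected_revenue / (recapture_rate * avg_reimbursement)
    print(f"${projected_revenue / 1e9:.0f}B projection implies "
          f"~{implied_passed_on_cases / 1e6:.2f}M surgically eligible cases passed on today")
```

Under those assumed inputs, the $3–4 billion figure corresponds to roughly 1.2–1.5 million eligible cases that are currently passed on, of which one in five returns to surgery.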
If you can differentiate robotic outcomes, you drive adoption, and today robotic systems are only used on one to two out of every ten eligible procedures. We think that should look more like six to seven out of ten eligible procedures done robotically, and we think the manufacturers would agree with us.

Surgical oncology is a really nice market because we get to that patient outcome of cancer-free survival: roughly a million procedures and a $13 billion reimbursement market. But it also provides a nice opportunity for platform expansion. The imaging device I showed earlier is purpose-built for the kidney, but more specifically it's purpose-built for solid organs, and we intend to pivot it from the kidney to the liver. We have a second design for a natural-lumen prostate device, an intraurethral imaging device, which we think lends itself to other luminal organs. We would pivot that prostate device to uterine wall fibroid cases as a next step.

This is our team. We have a small cohort from Corindus Vascular Robotics; we had an exit to Siemens Healthineers back in 2019 and learned a lot through that process. We're backed by a world-leading physician with more than 5,000 robotic cases under their belt.

And here's a little bit of a summary to wrap. Based on image-guided surgical navigation, the thing I'd want you to leave the room with is this capability we have of automating these procedures with highly accurate AI work. The patient outcome gain is critical; it creates clinical traction and allows us to scale and grow. The nuances of this imaging technology are quite complex; it's highly differentiated, and it is not easy to follow or replicate. We think we have a really wide moat on getting high-resolution 3D ultrasound inside the body. And this is a really attractive multiple-on-invested-capital opportunity for investors. We've developed a fairly capital-efficient development program, have a 510(k) pathway defined through a predicate device, and a scalable recurring revenue model, again based in software with high margins. I'll be around if anyone's interested in learning more about the pro forma or about the tech. Feel free to send me a note or schedule time and we can have a meeting. Thank you.


 
